datasetId stringlengths 2 117 | card stringlengths 19 1.01M |
|---|---|
parler-tts/mls-eng-10k-tags_tagged_10k_generated | ---
pretty_name: Annotations of 10K hours of English MLS
annotations_creators:
- expert-generated
language_creators:
- crowdsourced
- expert-generated
language:
- en
license:
- cc-by-4.0
multilinguality:
- multilingual
paperswithcode_id: multilingual-librispeech
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- automatic-speech-recognition
- text-to-speech
- text-to-audio
dataset_info:
features:
- name: original_path
dtype: string
- name: begin_time
dtype: float64
- name: end_time
dtype: float64
- name: audio_duration
dtype: float64
- name: speaker_id
dtype: string
- name: book_id
dtype: string
- name: utterance_pitch_mean
dtype: float32
- name: utterance_pitch_std
dtype: float32
- name: snr
dtype: float64
- name: c50
dtype: float64
- name: speaking_rate
dtype: string
- name: phonemes
dtype: string
- name: gender
dtype: string
- name: pitch
dtype: string
- name: noise
dtype: string
- name: reverberation
dtype: string
- name: speech_monotony
dtype: string
- name: text_description
dtype: string
- name: original_text
dtype: string
- name: text
dtype: string
splits:
- name: dev
num_bytes: 4378721
num_examples: 3807
- name: test
num_bytes: 4360862
num_examples: 3769
- name: train
num_bytes: 2779317208
num_examples: 2420047
download_size: 1438356670
dataset_size: 2788056791
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
- split: train
path: data/train-*
---
# Dataset Card for Annotations of 10K hours of English MLS
This dataset consists of **annotations of a 10K-hour subset** of the **[English version of the Multilingual LibriSpeech (MLS) dataset](https://huggingface.co/datasets/parler-tts/mls_eng)**.
The MLS dataset is a large multilingual corpus suitable for speech research. It is derived from read LibriVox audiobooks and covers
8 languages - English, German, Dutch, Spanish, French, Italian, Portuguese, and Polish. It includes about 44.5K hours of English and a total of about 6K hours for the other languages.
This dataset annotates [a 10K-hour subset](https://huggingface.co/datasets/parler-tts/mls_eng_10k) of English MLS. Refer to this [dataset card](https://huggingface.co/datasets/facebook/multilingual_librispeech) for the other languages.
The `text_description` column provides natural language annotations of the characteristics of speakers and utterances, which have been generated using [the Data-Speech repository](https://github.com/huggingface/dataspeech).
This dataset was used alongside its [original version](https://huggingface.co/datasets/parler-tts/mls_eng_10k) and [LibriTTS-R](https://huggingface.co/datasets/blabble-io/libritts_r) to train [Parler-TTS Mini v0.1](https://huggingface.co/parler-tts/parler_tts_mini_v0.1).
A training recipe is available in [the Parler-TTS library](https://github.com/huggingface/parler-tts).
## Usage
Here is an example of how to load only the `train` split.
```python
from datasets import load_dataset

dataset = load_dataset("parler-tts/mls-eng-10k-tags_tagged_10k_generated", split="train")
```
Streaming is also supported.
```python
from datasets import load_dataset

dataset = load_dataset("parler-tts/mls-eng-10k-tags_tagged_10k_generated", streaming=True)
```
**Note:** This dataset does not keep the audio column of the original version. You can merge it back into the original dataset using [this script](https://github.com/huggingface/dataspeech/blob/main/scripts/merge_audio_to_metadata.py) or, even better, take inspiration from [the training script](https://github.com/ylacombe/parler-tts/blob/3c8822985fe6cec482ecf868b04e866428bcd7bc/training/run_parler_tts_training.py#L648) of Parler-TTS, which efficiently processes multiple annotated datasets.
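Conceptually, the merge is a join between the annotation rows and the original audio rows on a shared key. A minimal pure-Python sketch of that idea, with hypothetical stand-in rows (the linked script works on `datasets` objects, and the real join may key on more columns than `original_path` alone):

```python
# Annotation rows mirror this dataset's columns; audio rows are hypothetical stand-ins.
annotations = [
    {"original_path": "a.flac", "text_description": "calm female voice"},
    {"original_path": "b.flac", "text_description": "fast male voice"},
]
audio_rows = [
    {"original_path": "a.flac", "audio": b"..."},
    {"original_path": "b.flac", "audio": b"..."},
]

# Index annotations by the shared key, then attach them to each audio row.
by_path = {row["original_path"]: row for row in annotations}
merged = [{**row, **by_path[row["original_path"]]} for row in audio_rows]
```

This assumes one utterance per audio file; with multiple utterances per file, the key would need to include the timing columns as well.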
### Motivation
This dataset is a reproduction of work from the paper [Natural language guidance of high-fidelity text-to-speech with synthetic annotations](https://www.text-description-to-speech.com) by Dan Lyth and Simon King, from Stability AI and Edinburgh University respectively.
It was designed to train the [Parler-TTS Mini v0.1](https://huggingface.co/parler-tts/parler_tts_mini_v0.1) model.
Unlike other TTS models, Parler-TTS is a **fully open-source** release. All of the datasets, pre-processing, training code, and weights are released publicly under a permissive license, enabling the community to build on our work and develop their own powerful TTS models.
Parler-TTS was released alongside:
* [The Parler-TTS repository](https://github.com/huggingface/parler-tts) - you can train and fine-tune your own version of the model.
* [The Data-Speech repository](https://github.com/huggingface/dataspeech) - a suite of utility scripts designed to annotate speech datasets.
* [The Parler-TTS organization](https://huggingface.co/parler-tts) - where you can find the annotated datasets as well as future checkpoints.
### License
Public Domain, Creative Commons Attribution 4.0 International Public License ([CC-BY-4.0](https://creativecommons.org/licenses/by/4.0/legalcode))
## Citation
```
@article{Pratap2020MLSAL,
title={MLS: A Large-Scale Multilingual Dataset for Speech Research},
author={Vineel Pratap and Qiantong Xu and Anuroop Sriram and Gabriel Synnaeve and Ronan Collobert},
journal={ArXiv},
year={2020},
volume={abs/2012.03411}
}
```
```
@misc{lacombe-etal-2024-dataspeech,
author = {Yoach Lacombe and Vaibhav Srivastav and Sanchit Gandhi},
title = {Data-Speech},
year = {2024},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/ylacombe/dataspeech}}
}
```
```
@misc{lyth2024natural,
title={Natural language guidance of high-fidelity text-to-speech with synthetic annotations},
author={Dan Lyth and Simon King},
year={2024},
eprint={2402.01912},
archivePrefix={arXiv},
primaryClass={cs.SD}
}
``` |
Jasshl/living_room | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 21346977.0
num_examples: 254
download_size: 19917130
dataset_size: 21346977.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
olm/wikipedia | ---
annotations_creators:
- no-annotation
language_creators:
- crowdsourced
pretty_name: Wikipedia
paperswithcode_id: null
license:
- cc-by-sa-3.0
- gfdl
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
source_datasets:
- original
multilinguality:
- multilingual
size_categories:
- n<1K
- 1K<n<10K
- 10K<n<100K
- 100K<n<1M
- 1M<n<10M
language:
- aa
- ab
- ace
- af
- ak
- als
- am
- an
- ang
- ar
- arc
- arz
- as
- ast
- atj
- av
- ay
- az
- azb
- ba
- bar
- bcl
- be
- bg
- bh
- bi
- bjn
- bm
- bn
- bo
- bpy
- br
- bs
- bug
- bxr
- ca
- cbk
- cdo
- ce
- ceb
- ch
- cho
- chr
- chy
- ckb
- co
- cr
- crh
- cs
- csb
- cu
- cv
- cy
- da
- de
- din
- diq
- dsb
- dty
- dv
- dz
- ee
- el
- eml
- en
- eo
- es
- et
- eu
- ext
- fa
- ff
- fi
- fj
- fo
- fr
- frp
- frr
- fur
- fy
- ga
- gag
- gan
- gd
- gl
- glk
- gn
- gom
- gor
- got
- gu
- gv
- ha
- hak
- haw
- he
- hi
- hif
- ho
- hr
- hsb
- ht
- hu
- hy
- ia
- id
- ie
- ig
- ii
- ik
- ilo
- inh
- io
- is
- it
- iu
- ja
- jam
- jbo
- jv
- ka
- kaa
- kab
- kbd
- kbp
- kg
- ki
- kj
- kk
- kl
- km
- kn
- ko
- koi
- krc
- ks
- ksh
- ku
- kv
- kw
- ky
- la
- lad
- lb
- lbe
- lez
- lfn
- lg
- li
- lij
- lmo
- ln
- lo
- lrc
- lt
- ltg
- lv
- lzh
- mai
- mdf
- mg
- mh
- mhr
- mi
- min
- mk
- ml
- mn
- mr
- mrj
- ms
- mt
- mus
- mwl
- my
- myv
- mzn
- na
- nah
- nan
- nap
- nds
- ne
- new
- ng
- nl
- nn
- 'no'
- nov
- nrf
- nso
- nv
- ny
- oc
- olo
- om
- or
- os
- pa
- pag
- pam
- pap
- pcd
- pdc
- pfl
- pi
- pih
- pl
- pms
- pnb
- pnt
- ps
- pt
- qu
- rm
- rmy
- rn
- ro
- ru
- rue
- rup
- rw
- sa
- sah
- sat
- sc
- scn
- sco
- sd
- se
- sg
- sgs
- sh
- si
- sk
- sl
- sm
- sn
- so
- sq
- sr
- srn
- ss
- st
- stq
- su
- sv
- sw
- szl
- ta
- tcy
- tdt
- te
- tg
- th
- ti
- tk
- tl
- tn
- to
- tpi
- tr
- ts
- tt
- tum
- tw
- ty
- tyv
- udm
- ug
- uk
- ur
- uz
- ve
- vec
- vep
- vi
- vls
- vo
- vro
- wa
- war
- wo
- wuu
- xal
- xh
- xmf
- yi
- yo
- yue
- za
- zea
- zh
- zu
language_bcp47:
- nds-nl
config_names:
- 20220301.aa
- 20220301.ab
- 20220301.ace
- 20220301.ady
- 20220301.af
- 20220301.ak
- 20220301.als
- 20220301.am
- 20220301.an
- 20220301.ang
- 20220301.ar
- 20220301.arc
- 20220301.arz
- 20220301.as
- 20220301.ast
- 20220301.atj
- 20220301.av
- 20220301.ay
- 20220301.az
- 20220301.azb
- 20220301.ba
- 20220301.bar
- 20220301.bat-smg
- 20220301.bcl
- 20220301.be
- 20220301.be-x-old
- 20220301.bg
- 20220301.bh
- 20220301.bi
- 20220301.bjn
- 20220301.bm
- 20220301.bn
- 20220301.bo
- 20220301.bpy
- 20220301.br
- 20220301.bs
- 20220301.bug
- 20220301.bxr
- 20220301.ca
- 20220301.cbk-zam
- 20220301.cdo
- 20220301.ce
- 20220301.ceb
- 20220301.ch
- 20220301.cho
- 20220301.chr
- 20220301.chy
- 20220301.ckb
- 20220301.co
- 20220301.cr
- 20220301.crh
- 20220301.cs
- 20220301.csb
- 20220301.cu
- 20220301.cv
- 20220301.cy
- 20220301.da
- 20220301.de
- 20220301.din
- 20220301.diq
- 20220301.dsb
- 20220301.dty
- 20220301.dv
- 20220301.dz
- 20220301.ee
- 20220301.el
- 20220301.eml
- 20220301.en
- 20220301.eo
- 20220301.es
- 20220301.et
- 20220301.eu
- 20220301.ext
- 20220301.fa
- 20220301.ff
- 20220301.fi
- 20220301.fiu-vro
- 20220301.fj
- 20220301.fo
- 20220301.fr
- 20220301.frp
- 20220301.frr
- 20220301.fur
- 20220301.fy
- 20220301.ga
- 20220301.gag
- 20220301.gan
- 20220301.gd
- 20220301.gl
- 20220301.glk
- 20220301.gn
- 20220301.gom
- 20220301.gor
- 20220301.got
- 20220301.gu
- 20220301.gv
- 20220301.ha
- 20220301.hak
- 20220301.haw
- 20220301.he
- 20220301.hi
- 20220301.hif
- 20220301.ho
- 20220301.hr
- 20220301.hsb
- 20220301.ht
- 20220301.hu
- 20220301.hy
- 20220301.ia
- 20220301.id
- 20220301.ie
- 20220301.ig
- 20220301.ii
- 20220301.ik
- 20220301.ilo
- 20220301.inh
- 20220301.io
- 20220301.is
- 20220301.it
- 20220301.iu
- 20220301.ja
- 20220301.jam
- 20220301.jbo
- 20220301.jv
- 20220301.ka
- 20220301.kaa
- 20220301.kab
- 20220301.kbd
- 20220301.kbp
- 20220301.kg
- 20220301.ki
- 20220301.kj
- 20220301.kk
- 20220301.kl
- 20220301.km
- 20220301.kn
- 20220301.ko
- 20220301.koi
- 20220301.krc
- 20220301.ks
- 20220301.ksh
- 20220301.ku
- 20220301.kv
- 20220301.kw
- 20220301.ky
- 20220301.la
- 20220301.lad
- 20220301.lb
- 20220301.lbe
- 20220301.lez
- 20220301.lfn
- 20220301.lg
- 20220301.li
- 20220301.lij
- 20220301.lmo
- 20220301.ln
- 20220301.lo
- 20220301.lrc
- 20220301.lt
- 20220301.ltg
- 20220301.lv
- 20220301.mai
- 20220301.map-bms
- 20220301.mdf
- 20220301.mg
- 20220301.mh
- 20220301.mhr
- 20220301.mi
- 20220301.min
- 20220301.mk
- 20220301.ml
- 20220301.mn
- 20220301.mr
- 20220301.mrj
- 20220301.ms
- 20220301.mt
- 20220301.mus
- 20220301.mwl
- 20220301.my
- 20220301.myv
- 20220301.mzn
- 20220301.na
- 20220301.nah
- 20220301.nap
- 20220301.nds
- 20220301.nds-nl
- 20220301.ne
- 20220301.new
- 20220301.ng
- 20220301.nl
- 20220301.nn
- 20220301.no
- 20220301.nov
- 20220301.nrm
- 20220301.nso
- 20220301.nv
- 20220301.ny
- 20220301.oc
- 20220301.olo
- 20220301.om
- 20220301.or
- 20220301.os
- 20220301.pa
- 20220301.pag
- 20220301.pam
- 20220301.pap
- 20220301.pcd
- 20220301.pdc
- 20220301.pfl
- 20220301.pi
- 20220301.pih
- 20220301.pl
- 20220301.pms
- 20220301.pnb
- 20220301.pnt
- 20220301.ps
- 20220301.pt
- 20220301.qu
- 20220301.rm
- 20220301.rmy
- 20220301.rn
- 20220301.ro
- 20220301.roa-rup
- 20220301.roa-tara
- 20220301.ru
- 20220301.rue
- 20220301.rw
- 20220301.sa
- 20220301.sah
- 20220301.sat
- 20220301.sc
- 20220301.scn
- 20220301.sco
- 20220301.sd
- 20220301.se
- 20220301.sg
- 20220301.sh
- 20220301.si
- 20220301.simple
- 20220301.sk
- 20220301.sl
- 20220301.sm
- 20220301.sn
- 20220301.so
- 20220301.sq
- 20220301.sr
- 20220301.srn
- 20220301.ss
- 20220301.st
- 20220301.stq
- 20220301.su
- 20220301.sv
- 20220301.sw
- 20220301.szl
- 20220301.ta
- 20220301.tcy
- 20220301.te
- 20220301.tet
- 20220301.tg
- 20220301.th
- 20220301.ti
- 20220301.tk
- 20220301.tl
- 20220301.tn
- 20220301.to
- 20220301.tpi
- 20220301.tr
- 20220301.ts
- 20220301.tt
- 20220301.tum
- 20220301.tw
- 20220301.ty
- 20220301.tyv
- 20220301.udm
- 20220301.ug
- 20220301.uk
- 20220301.ur
- 20220301.uz
- 20220301.ve
- 20220301.vec
- 20220301.vep
- 20220301.vi
- 20220301.vls
- 20220301.vo
- 20220301.wa
- 20220301.war
- 20220301.wo
- 20220301.wuu
- 20220301.xal
- 20220301.xh
- 20220301.xmf
- 20220301.yi
- 20220301.yo
- 20220301.za
- 20220301.zea
- 20220301.zh
- 20220301.zh-classical
- 20220301.zh-min-nan
- 20220301.zh-yue
- 20220301.zu
---
# Dataset Card for Wikipedia
This repo is a fork of the original Hugging Face Wikipedia repo [here](https://huggingface.co/datasets/wikipedia).
The difference is that this fork does away with the need for `apache-beam`, and it is very fast if your machine has many CPUs.
It uses all available CPUs to create a clean Wikipedia pretraining dataset; processing all of English Wikipedia takes less than an hour on a GCP n1-standard-96.
This fork is also used in the [OLM Project](https://github.com/huggingface/olm-datasets) to pull and process up-to-date Wikipedia snapshots.
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://dumps.wikimedia.org](https://dumps.wikimedia.org)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Dataset Summary
Wikipedia dataset containing cleaned articles in all languages.
The datasets are built from the Wikipedia dumps
(https://dumps.wikimedia.org/) with one split per language. Each example
contains the content of one full Wikipedia article, cleaned to strip
markup and unwanted sections (references, etc.).
The articles are parsed using the ``mwparserfromhell`` tool, and ``multiprocess`` is used for parallelization.
To load this dataset you need to install these first:
```bash
pip install mwparserfromhell==0.6.4 multiprocess==0.70.13
```
Then, you can load any subset of Wikipedia per language and per date this way:
```python
from datasets import load_dataset
load_dataset("olm/wikipedia", language="en", date="20220920")
```
You can find the full list of languages and dates [here](https://dumps.wikimedia.org/backup-index.html).
### Supported Tasks and Leaderboards
The dataset is generally used for Language Modeling.
### Languages
You can find the list of languages [here](https://meta.wikimedia.org/wiki/List_of_Wikipedias).
## Dataset Structure
### Data Instances
An example looks as follows:
```
{'id': '1',
'url': 'https://simple.wikipedia.org/wiki/April',
'title': 'April',
'text': 'April is the fourth month...'
}
```
### Data Fields
The data fields are the same among all configurations:
- `id` (`str`): ID of the article.
- `url` (`str`): URL of the article.
- `title` (`str`): Title of the article.
- `text` (`str`): Text content of the article.
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
Most of Wikipedia's text and many of its images are co-licensed under the
[Creative Commons Attribution-ShareAlike 3.0 Unported License](https://en.wikipedia.org/wiki/Wikipedia:Text_of_Creative_Commons_Attribution-ShareAlike_3.0_Unported_License)
(CC BY-SA) and the [GNU Free Documentation License](https://en.wikipedia.org/wiki/Wikipedia:Text_of_the_GNU_Free_Documentation_License)
(GFDL) (unversioned, with no invariant sections, front-cover texts, or back-cover texts).
Some text has been imported only under CC BY-SA and CC BY-SA-compatible license and cannot be reused under GFDL; such
text will be identified on the page footer, in the page history, or on the discussion page of the article that utilizes
the text.
### Citation Information
```
@ONLINE{wikidump,
author = "Wikimedia Foundation",
title = "Wikimedia Downloads",
url = "https://dumps.wikimedia.org"
}
```
|
ibrohim8828/Fiqih | ---
license: unknown
---
|
caffeinism/tiny-pajama-parquet | ---
task_categories:
- text-generation
language:
- en
---
# TinyPajama
Subset of [SlimPajama](https://huggingface.co/datasets/cerebras/SlimPajama-627B)
Approximately 10B tokens |
Priyesh00/SherlockEval | ---
license: mit
task_categories:
- text2text-generation
language:
- en
- hi
tags:
- art
size_categories:
- n<1K
---
It's quite clear to me that needle-in-a-haystack tests are quite broken, with every model paper reporting something like 99.99% coverage over 100k+ tokens. In practice, anything over 20% of the model's total context size is useless for RAG via input tokens.
So the goal here is to expand this dataset into the most comprehensive stack of mystery novels and have the LLM predict the final killer, as proposed by [Sholto Douglas](https://www.linkedin.com/in/sholto) on [this podcast](https://www.youtube.com/watch?v=UTuuTTnjxMQ&pp=ygUOc2hvbHRvIGRvdWdsYXM%3D) with Dwarkesh.
> Obviously the missing piece here is that the pre-training data almost certainly covers the Sherlock Holmes novels. The most immediate fix I found was to replace all English names with Hindi (or other regional-language) names, which drastically increases the token count because these words require more vocabulary tokens to embed.
Since I keep running out of API calls to replace names in the text, I have attached a list of 500 names that I could cleanly generate in the [recipe here](https://gist.github.com/never2average/57f358799c0ae2a40254df365b17f13a).
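The name swap itself doesn't need an API at all; given a mapping, it is a word-boundary substitution. A minimal sketch with a hypothetical three-name mapping (the actual recipe uses the generated list of 500 names):

```python
import re

# Hypothetical mapping; the real recipe draws from a generated list of 500 names.
name_map = {"Sherlock": "Aarav", "Watson": "Vikram", "Moriarty": "Raghunath"}

def replace_names(text, mapping):
    # Word boundaries keep substrings like "Watsonville" untouched.
    pattern = re.compile(r"\b(" + "|".join(map(re.escape, mapping)) + r")\b")
    return pattern.sub(lambda m: mapping[m.group(1)], text)

print(replace_names("Sherlock glanced at Watson.", name_map))  # Aarav glanced at Vikram.
```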
My goal for v2 of this dataset is to take more of these novels, convert them into an assignment task for detectives, and check sample efficiency in getting the LLM to be as good as Sherlock Holmes.
Back-up/sotaysv-qa | ---
dataset_info:
features:
- name: Questions
dtype: string
- name: Answers
dtype: string
splits:
- name: train
num_bytes: 129518
num_examples: 176
download_size: 56231
dataset_size: 129518
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "sotaysv-qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
unum-cloud/ann-cc-3m | ---
license: apache-2.0
---
# CC3M Image-Text Embeddings
- `images_part{1-3}.txt` are text files with base64-encoded images.
- `texts.txt` is a text file with captions for images.
- `images.{model_name}.fbin` is a binary file with {model_name} image embeddings.
- `images.{model_name}.usearch` is a binary file with a serialized USearch image index which contains `images.{model_name}.fbin`.
- `texts.{model_name}.fbin` is a binary file with {model_name} text embeddings.
- `texts.{model_name}.usearch` is a binary file with a serialized USearch text index which contains `texts.{model_name}.fbin`.
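The card does not document the internal layout of the `.fbin` files. A common convention for such files, assumed here purely for illustration, is a header of two little-endian uint32s (row count, dimensions) followed by row-major float32 data; verify against the actual files before relying on this. A stdlib-only round-trip sketch:

```python
import struct
from array import array

# Assumed layout: <rows:uint32><dims:uint32> then rows*dims float32s, little-endian.
def write_fbin(path, vectors):
    rows, dims = len(vectors), len(vectors[0])
    with open(path, "wb") as f:
        f.write(struct.pack("<II", rows, dims))
        array("f", [x for v in vectors for x in v]).tofile(f)

def read_fbin(path):
    with open(path, "rb") as f:
        rows, dims = struct.unpack("<II", f.read(8))
        data = array("f")
        data.fromfile(f, rows * dims)
    return [data[i * dims:(i + 1) * dims].tolist() for i in range(rows)]

# Values chosen to be exactly representable in float32.
write_fbin("demo.fbin", [[0.5, 1.0], [2.0, -0.25]])
assert read_fbin("demo.fbin") == [[0.5, 1.0], [2.0, -0.25]]
```

The companion `.usearch` files are serialized indexes and should be loaded with USearch's own index-restore API rather than parsed by hand.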
|
micsell/hebrew_keywords1 | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: label
dtype:
class_label:
names:
'0': daateh
'1': hait
'2': higat
'3': hona
'4': itah
'5': lah
'6': otah
'7': shelah
splits:
- name: train
num_bytes: 31020219.723
num_examples: 2581
download_size: 32065400
dataset_size: 31020219.723
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
KentoTsu/hinny | ---
license: openrail
---
|
nairaxo/shikomori-sentiment | ---
dataset_info:
features:
- name: Text
dtype: string
- name: Polarity
dtype: float64
- name: Sentiment
dtype: string
- name: Dialect
dtype: string
- name: Source
dtype: string
- name: Type
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 2275872
num_examples: 17419
download_size: 1041646
dataset_size: 2275872
---
# Dataset Card for "shikomori-sentiment"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FINNUMBER/FINCH_TRAIN_SA_ESG_100 | ---
dataset_info:
features:
- name: task
dtype: string
- name: context
dtype: string
- name: question
dtype: 'null'
- name: answer
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 912858
num_examples: 100
download_size: 503144
dataset_size: 912858
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/grecale_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of grecale (Kantai Collection)
This is the dataset of grecale (Kantai Collection), containing 350 images and their tags.
The core tags of this character are `blonde_hair, long_hair, bow, green_eyes, hair_bow, hair_between_eyes, pink_bow, wavy_hair, ribbon, blue_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 350 | 332.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/grecale_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 350 | 218.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/grecale_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 799 | 466.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/grecale_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 350 | 304.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/grecale_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 799 | 607.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/grecale_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/grecale_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discoverable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 30 |  |  |  |  |  | 1girl, sailor_dress, sleeveless_dress, solo, white_dress, white_sailor_collar, neckerchief, looking_at_viewer, striped, simple_background, white_background, hat, hair_ribbon, smile, tongue_out, blush |
| 1 | 21 |  |  |  |  |  | blush, demon_horns, 1girl, fake_horns, demon_wings, solo, smile, demon_tail, halloween_costume, clothes_writing, collarbone, single_thighhigh, black_skirt, black_thighhighs, italian_text, simple_background, lollipop, dress, holding, looking_at_viewer, sleeveless, tongue_out, white_background |
| 2 | 45 |  |  |  |  |  | 1girl, solo, hair_flower, looking_at_viewer, side_ponytail, blush, necklace, smile, navel, pink_flower, denim_shorts, blue_shorts, flat_chest, collarbone, open_mouth, short_shorts, blue_bikini, multicolored_bikini, simple_background, white_background |
| 3 | 7 |  |  |  |  |  | 1girl, black_skirt, blush, clothes_writing, long_sleeves, pink_shirt, smile, solo, collarbone, gift_box, simple_background, valentine, black_thighhighs, holding_gift, looking_at_viewer, white_background, hair_ornament, heart_print, pleated_skirt, heart_necklace, open_mouth, polka_dot_skirt, tongue_out, twitter_username, white_jacket |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | sailor_dress | sleeveless_dress | solo | white_dress | white_sailor_collar | neckerchief | looking_at_viewer | striped | simple_background | white_background | hat | hair_ribbon | smile | tongue_out | blush | demon_horns | fake_horns | demon_wings | demon_tail | halloween_costume | clothes_writing | collarbone | single_thighhigh | black_skirt | black_thighhighs | italian_text | lollipop | dress | holding | sleeveless | hair_flower | side_ponytail | necklace | navel | pink_flower | denim_shorts | blue_shorts | flat_chest | open_mouth | short_shorts | blue_bikini | multicolored_bikini | long_sleeves | pink_shirt | gift_box | valentine | holding_gift | hair_ornament | heart_print | pleated_skirt | heart_necklace | polka_dot_skirt | twitter_username | white_jacket |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-------------------|:-------|:--------------|:----------------------|:--------------|:--------------------|:----------|:--------------------|:-------------------|:------|:--------------|:--------|:-------------|:--------|:--------------|:-------------|:--------------|:-------------|:--------------------|:------------------|:-------------|:-------------------|:--------------|:-------------------|:---------------|:-----------|:--------|:----------|:-------------|:--------------|:----------------|:-----------|:--------|:--------------|:---------------|:--------------|:-------------|:-------------|:---------------|:--------------|:----------------------|:---------------|:-------------|:-----------|:------------|:---------------|:----------------|:--------------|:----------------|:-----------------|:------------------|:-------------------|:---------------|
| 0 | 30 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 21 |  |  |  |  |  | X | | | X | | | | X | | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 45 |  |  |  |  |  | X | | | X | | | | X | | X | X | | | X | | X | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | | | X | | | | X | | X | X | | | X | X | X | | | | | | X | X | | X | X | | | | | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
Alan318/2 | ---
license: mit
---
|
Nexdata/159_Hours_Uyghur_Conversational_Speech_Data_by_Mobile_Phone | ---
license: cc-by-nc-nd-4.0
---
## Description
Uyghur (China) spontaneous dialogue smartphone speech dataset, collected from dialogues on given topics covering 20+ domains. Transcriptions include text content, speaker ID, gender, age, and other attributes. The data was collected from a large and geographically diverse pool of 328 native speakers, enhancing model performance on real and complex tasks. Quality tested by various AI companies. We strictly adhere to data protection regulations and privacy standards, ensuring that user privacy and legal rights are maintained throughout the data collection, storage, and usage processes; our datasets are all GDPR, CCPA, and PIPL compliant.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1176?source=Huggingface
## Format
16kHz, 16 bit, wav, mono channel;
## Content category
Dialogue based on given topics;
## Recording condition
Low background noise (indoor);
## Recording device
Android smartphone, iPhone;
## Speaker
328 native speakers in total, 37% male and 63% female;
## Country
China(CHN);
## Language(Region) Code
ug-CN;
## Language
Uyghur;
## Features of annotation
Transcription text, timestamp, speaker ID, gender, noise, PII redacted.
## Accuracy Rate
Sentence Accuracy Rate (SAR) 95%
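Sentence Accuracy Rate is the fraction of transcribed sentences that exactly match the reference transcription, so 95% SAR means at least 95 of every 100 sentences are transcribed perfectly. A minimal sketch of the metric (hypothetical helper, not Nexdata's QA tooling):

```python
def sentence_accuracy_rate(references: list[str], hypotheses: list[str]) -> float:
    """Fraction of hypothesis sentences that exactly match their reference."""
    if len(references) != len(hypotheses):
        raise ValueError("reference and hypothesis lists must be the same length")
    correct = sum(ref == hyp for ref, hyp in zip(references, hypotheses))
    return correct / len(references)
```

Note that SAR is stricter than word-level accuracy: a single wrong word makes the whole sentence count as incorrect.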
# Licensing Information
Commercial License
|
open-llm-leaderboard/details_mosaicml__mpt-7b-chat | ---
pretty_name: Evaluation run of mosaicml/mpt-7b-chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mosaicml/mpt-7b-chat](https://huggingface.co/mosaicml/mpt-7b-chat) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mosaicml__mpt-7b-chat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-17T09:38:22.163645](https://huggingface.co/datasets/open-llm-leaderboard/details_mosaicml__mpt-7b-chat/blob/main/results_2023-10-17T09-38-22.163645.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.06952600671140939,\n\
\ \"em_stderr\": 0.002604746204517829,\n \"f1\": 0.12196937919463072,\n\
\ \"f1_stderr\": 0.002840521979064293,\n \"acc\": 0.3626168565432783,\n\
\ \"acc_stderr\": 0.009260585769647573\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.06952600671140939,\n \"em_stderr\": 0.002604746204517829,\n\
\ \"f1\": 0.12196937919463072,\n \"f1_stderr\": 0.002840521979064293\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04094010614101592,\n \
\ \"acc_stderr\": 0.005458076796294338\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6842936069455406,\n \"acc_stderr\": 0.01306309474300081\n\
\ }\n}\n```"
repo_url: https://huggingface.co/mosaicml/mpt-7b-chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|arc:challenge|25_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_17T09_38_22.163645
path:
- '**/details_harness|drop|3_2023-10-17T09-38-22.163645.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-17T09-38-22.163645.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_17T09_38_22.163645
path:
- '**/details_harness|gsm8k|5_2023-10-17T09-38-22.163645.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-17T09-38-22.163645.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hellaswag|10_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-20T10:00:41.356813.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-20T10:00:41.356813.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-20T10:00:41.356813.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_17T09_38_22.163645
path:
- '**/details_harness|winogrande|5_2023-10-17T09-38-22.163645.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-17T09-38-22.163645.parquet'
- config_name: results
data_files:
- split: 2023_07_20T10_00_41.356813
path:
- results_2023-07-20T10:00:41.356813.parquet
- split: 2023_10_17T09_38_22.163645
path:
- results_2023-10-17T09-38-22.163645.parquet
- split: latest
path:
- results_2023-10-17T09-38-22.163645.parquet
---
# Dataset Card for Evaluation run of mosaicml/mpt-7b-chat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/mosaicml/mpt-7b-chat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [mosaicml/mpt-7b-chat](https://huggingface.co/mosaicml/mpt-7b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mosaicml__mpt-7b-chat",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-17T09:38:22.163645](https://huggingface.co/datasets/open-llm-leaderboard/details_mosaicml__mpt-7b-chat/blob/main/results_2023-10-17T09-38-22.163645.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.06952600671140939,
"em_stderr": 0.002604746204517829,
"f1": 0.12196937919463072,
"f1_stderr": 0.002840521979064293,
"acc": 0.3626168565432783,
"acc_stderr": 0.009260585769647573
},
"harness|drop|3": {
"em": 0.06952600671140939,
"em_stderr": 0.002604746204517829,
"f1": 0.12196937919463072,
"f1_stderr": 0.002840521979064293
},
"harness|gsm8k|5": {
"acc": 0.04094010614101592,
"acc_stderr": 0.005458076796294338
},
"harness|winogrande|5": {
"acc": 0.6842936069455406,
"acc_stderr": 0.01306309474300081
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/07a40269 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 186
num_examples: 10
download_size: 1334
dataset_size: 186
---
# Dataset Card for "07a40269"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gabtan99/pex-conversations | ---
language:
- tl
- fil
license:
- unknown
multilinguality:
- multilingual
size_categories:
- unknown
source_datasets:
- original
task_categories:
- sequence-modeling
task_ids:
- dialogue-modeling
- language-modeling
pretty_name: PEx Conversations
tags:
- multi-turn
---
# PinoyExchange (PEx) Conversations Dataset
# Summary
PEx Conversations is a dataset composed of threads collected from PinoyExchange.com, with responses in Tagalog, English, or Taglish.
The corpus consists of 45K scraped threads from 8 subforums. The data contains only the user messages, meaning any images, videos, links, or other embedded HTML were not collected during scraping. All characters have been transliterated to their closest ASCII representation, and Unicode errors were fixed.
# Format
The data is organized by category. Each object in the list is composed of:
* category - the category of the threads
* conversations - the list of threads
The threads inside `conversations` have a recursive structure consisting of the following:
* text - the response/reply/prompt
* replies - a list of replies to this prompt. Each reply in the list has the same structure, with its own `text` and `replies` components.
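As a rough sketch of how this recursive structure could be traversed (assuming hypothetical thread objects with the `text` and `replies` fields described above), a small recursive counter:

```python
def count_utterances(node):
    """Count the utterances in a thread node and all of its nested replies."""
    # Each node contributes its own text plus everything under its replies.
    return 1 + sum(count_utterances(reply) for reply in node.get("replies", []))

# A tiny illustrative thread in the structure described above.
thread = {
    "text": "Anong masarap na kainan sa Manila?",
    "replies": [
        {"text": "Try the food parks in Maginhawa.", "replies": []},
        {
            "text": "Depende sa budget mo.",
            "replies": [{"text": "True, pero sulit ang street food.", "replies": []}],
        },
    ],
}

print(count_utterances(thread))  # 4
```

The same traversal pattern can be adapted to flatten a thread into (prompt, reply) pairs for dialogue modeling.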
# Subforum percentages
The amount of data per subforum is as follows:
* Small Talk - 5K conversations with 1.16M utterances
* Food & Drinks - 8.2K conversations with 273K utterances
* Health & Wellness - 6.3K conversations with 93K utterances
* Body & Fitness - 3.9K conversations with 94K utterances
* Home & Garden - 3.6K conversations with 71K utterances
* Style & Fashion - 9.7K conversations with 197K utterances
* Travel & Leisure - 7.3K conversations with 431K utterances
* Visas & Immigration - 1.1K conversations with 99K utterances
# Model Research
[Tagalog DialoGPT](https://huggingface.co/gabtan99/dialogpt-tagalog-medium) |
System36/dylan_beta | ---
license: cc-by-nc-sa-2.0
---
|
Saturo1234567/Gojo22 | ---
license: openrail
---
|
tr416/dataset_20231007_032211 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 762696.0
num_examples: 297
- name: test
num_bytes: 7704.0
num_examples: 3
download_size: 73870
dataset_size: 770400.0
---
# Dataset Card for "dataset_20231007_032211"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/07c1bf52 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 178
num_examples: 10
download_size: 1340
dataset_size: 178
---
# Dataset Card for "07c1bf52"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
WillHeld/JamPatoisNLI | ---
dataset_info:
features:
- name: Number
dtype: int64
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: train
num_bytes: 32336
num_examples: 250
- name: val
num_bytes: 27515
num_examples: 200
- name: test
num_bytes: 27342
num_examples: 200
download_size: 67207
dataset_size: 87193
---
This data comes from "JamPatoisNLI: A Jamaican Patois Natural Language Inference Dataset" by Ruth-Ann Armstrong, John Hewitt, Christopher Manning. Please cite the original work if you make use of this data:
```
@article{DBLP:journals/corr/abs-2212-03419,
author = {Ruth{-}Ann Armstrong and
John Hewitt and
Christopher D. Manning},
title = {JamPatoisNLI: {A} Jamaican Patois Natural Language Inference Dataset},
journal = {CoRR},
volume = {abs/2212.03419},
year = {2022},
url = {https://doi.org/10.48550/arXiv.2212.03419},
doi = {10.48550/arXiv.2212.03419},
eprinttype = {arXiv},
eprint = {2212.03419},
timestamp = {Mon, 02 Jan 2023 15:09:55 +0100},
biburl = {https://dblp.org/rec/journals/corr/abs-2212-03419.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
``` |
Santosh-Gupta/EncephalitisQueryDocuments | ---
license: mit
---
This is an Encephalitis Title, Abstracts, and Search Queries Dataset
This dataset contains pairs of encephalitis titles/abstracts and related search queries. The title is the first sentence in the column called Title_Abstract.
The search queries are highly relevant to the abstracts, but use different keywords and phrasing. As a result, the abstracts may not appear in search results when using these queries with traditional search engines.
The goal of this dataset is to train models to retrieve relevant documents for searches that may be overlooked by conventional term-matching approaches.
The dataset was created in August 2023 and contains 53,146 encephalitis abstracts collected using the Biopython library. GPT-3.5 was then used to generate, for each abstract, multiple search queries that are relevant to its topics but use wording/phrasing different from the abstract itself.
By training on this data, semantic retrieval models could better connect user search queries with relevant content. This has the potential to improve search recall for specialized domains like medical literature.
To open the dataset:
```
import pandas as pd
df = pd.read_parquet('raw_training_df.parquet')
``` |
lhoestq/pdfa_cc_main_2021_31_pdf_untruncated | ---
pretty_name: CC-MAIN-2021-31-PDF-UNTRUNCATED corpus
tags:
- ocr
- pdf
---
# Dataset card for the CC-MAIN-2021-31-PDF-UNTRUNCATED corpus
- **Homepage:** [https://pdfa.org/new-large-scale-pdf-corpus-now-publicly-available](https://pdfa.org/new-large-scale-pdf-corpus-now-publicly-available) |
SniiKz/llama2_Chat_trainingsetv3 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1874353
num_examples: 2645
download_size: 278443
dataset_size: 1874353
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama2_Chat_trainingsetv3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_bigcode__starcoderbase-7b | ---
pretty_name: Evaluation run of bigcode/starcoderbase-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bigcode/starcoderbase-7b](https://huggingface.co/bigcode/starcoderbase-7b) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bigcode__starcoderbase-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-14T22:30:37.851656](https://huggingface.co/datasets/open-llm-leaderboard/details_bigcode__starcoderbase-7b/blob/main/results_2024-02-14T22-30-37.851656.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2855544717164793,\n\
\ \"acc_stderr\": 0.032025544877512004,\n \"acc_norm\": 0.28731800624157283,\n\
\ \"acc_norm_stderr\": 0.03278279025567369,\n \"mc1\": 0.25091799265605874,\n\
\ \"mc1_stderr\": 0.015176985027707696,\n \"mc2\": 0.4046263361255611,\n\
\ \"mc2_stderr\": 0.014888506723649383\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2508532423208191,\n \"acc_stderr\": 0.012668198621315435,\n\
\ \"acc_norm\": 0.2986348122866894,\n \"acc_norm_stderr\": 0.013374078615068756\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3551085441147182,\n\
\ \"acc_stderr\": 0.004775681871529863,\n \"acc_norm\": 0.4386576379207329,\n\
\ \"acc_norm_stderr\": 0.004952087083128893\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34814814814814815,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.34814814814814815,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.03459777606810537,\n\
\ \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.03459777606810537\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.33962264150943394,\n \"acc_stderr\": 0.02914690474779833,\n\
\ \"acc_norm\": 0.33962264150943394,\n \"acc_norm_stderr\": 0.02914690474779833\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3194444444444444,\n\
\ \"acc_stderr\": 0.03899073687357336,\n \"acc_norm\": 0.3194444444444444,\n\
\ \"acc_norm_stderr\": 0.03899073687357336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.1907514450867052,\n\
\ \"acc_stderr\": 0.02995785132986934,\n \"acc_norm\": 0.1907514450867052,\n\
\ \"acc_norm_stderr\": 0.02995785132986934\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.04158307533083286,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.04158307533083286\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.31063829787234043,\n \"acc_stderr\": 0.03025123757921317,\n\
\ \"acc_norm\": 0.31063829787234043,\n \"acc_norm_stderr\": 0.03025123757921317\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.041424397194893596,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.041424397194893596\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.30344827586206896,\n \"acc_stderr\": 0.038312260488503336,\n\
\ \"acc_norm\": 0.30344827586206896,\n \"acc_norm_stderr\": 0.038312260488503336\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918417,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918417\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n\
\ \"acc_stderr\": 0.036196045241242515,\n \"acc_norm\": 0.20634920634920634,\n\
\ \"acc_norm_stderr\": 0.036196045241242515\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.23870967741935484,\n\
\ \"acc_stderr\": 0.024251071262208834,\n \"acc_norm\": 0.23870967741935484,\n\
\ \"acc_norm_stderr\": 0.024251071262208834\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03010833071801162,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03010833071801162\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23636363636363636,\n \"acc_stderr\": 0.033175059300091805,\n\
\ \"acc_norm\": 0.23636363636363636,\n \"acc_norm_stderr\": 0.033175059300091805\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2777777777777778,\n \"acc_stderr\": 0.03191178226713549,\n \"\
acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.03191178226713549\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.27461139896373055,\n \"acc_stderr\": 0.03221024508041154,\n\
\ \"acc_norm\": 0.27461139896373055,\n \"acc_norm_stderr\": 0.03221024508041154\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2512820512820513,\n \"acc_stderr\": 0.02199201666237056,\n \
\ \"acc_norm\": 0.2512820512820513,\n \"acc_norm_stderr\": 0.02199201666237056\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.02708037281514567,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.02708037281514567\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.028657491285071973,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.028657491285071973\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.25165562913907286,\n \"acc_stderr\": 0.03543304234389985,\n \"\
acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.03543304234389985\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.27339449541284405,\n \"acc_stderr\": 0.019109299846098278,\n \"\
acc_norm\": 0.27339449541284405,\n \"acc_norm_stderr\": 0.019109299846098278\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2222222222222222,\n \"acc_stderr\": 0.028353212866863434,\n \"\
acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.028353212866863434\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.3137254901960784,\n \"acc_stderr\": 0.03256685484460388,\n \"\
acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.03256685484460388\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2911392405063291,\n \"acc_stderr\": 0.029571601065753374,\n \
\ \"acc_norm\": 0.2911392405063291,\n \"acc_norm_stderr\": 0.029571601065753374\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.38565022421524664,\n\
\ \"acc_stderr\": 0.03266842214289202,\n \"acc_norm\": 0.38565022421524664,\n\
\ \"acc_norm_stderr\": 0.03266842214289202\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.29770992366412213,\n \"acc_stderr\": 0.04010358942462203,\n\
\ \"acc_norm\": 0.29770992366412213,\n \"acc_norm_stderr\": 0.04010358942462203\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.38016528925619836,\n \"acc_stderr\": 0.04431324501968431,\n \"\
acc_norm\": 0.38016528925619836,\n \"acc_norm_stderr\": 0.04431324501968431\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.39814814814814814,\n\
\ \"acc_stderr\": 0.047323326159788154,\n \"acc_norm\": 0.39814814814814814,\n\
\ \"acc_norm_stderr\": 0.047323326159788154\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.032910995786157686,\n\
\ \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.032910995786157686\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04287858751340455,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04287858751340455\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.27184466019417475,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.27184466019417475,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.36752136752136755,\n\
\ \"acc_stderr\": 0.03158539157745636,\n \"acc_norm\": 0.36752136752136755,\n\
\ \"acc_norm_stderr\": 0.03158539157745636\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.3052362707535121,\n\
\ \"acc_stderr\": 0.016467711947635123,\n \"acc_norm\": 0.3052362707535121,\n\
\ \"acc_norm_stderr\": 0.016467711947635123\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.30346820809248554,\n \"acc_stderr\": 0.024752411960917212,\n\
\ \"acc_norm\": 0.30346820809248554,\n \"acc_norm_stderr\": 0.024752411960917212\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n\
\ \"acc_stderr\": 0.014310999547961459,\n \"acc_norm\": 0.24134078212290502,\n\
\ \"acc_norm_stderr\": 0.014310999547961459\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2581699346405229,\n \"acc_stderr\": 0.025058503316958157,\n\
\ \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.025058503316958157\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3086816720257235,\n\
\ \"acc_stderr\": 0.026236965881153256,\n \"acc_norm\": 0.3086816720257235,\n\
\ \"acc_norm_stderr\": 0.026236965881153256\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2839506172839506,\n \"acc_stderr\": 0.02508947852376513,\n\
\ \"acc_norm\": 0.2839506172839506,\n \"acc_norm_stderr\": 0.02508947852376513\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.30851063829787234,\n \"acc_stderr\": 0.027553366165101362,\n \
\ \"acc_norm\": 0.30851063829787234,\n \"acc_norm_stderr\": 0.027553366165101362\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2796610169491525,\n\
\ \"acc_stderr\": 0.011463397393861974,\n \"acc_norm\": 0.2796610169491525,\n\
\ \"acc_norm_stderr\": 0.011463397393861974\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.23161764705882354,\n \"acc_stderr\": 0.025626533803777565,\n\
\ \"acc_norm\": 0.23161764705882354,\n \"acc_norm_stderr\": 0.025626533803777565\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2630718954248366,\n \"acc_stderr\": 0.017812676542320657,\n \
\ \"acc_norm\": 0.2630718954248366,\n \"acc_norm_stderr\": 0.017812676542320657\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2818181818181818,\n\
\ \"acc_stderr\": 0.04309118709946459,\n \"acc_norm\": 0.2818181818181818,\n\
\ \"acc_norm_stderr\": 0.04309118709946459\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.33877551020408164,\n \"acc_stderr\": 0.030299506562154185,\n\
\ \"acc_norm\": 0.33877551020408164,\n \"acc_norm_stderr\": 0.030299506562154185\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.31840796019900497,\n\
\ \"acc_stderr\": 0.03294118479054095,\n \"acc_norm\": 0.31840796019900497,\n\
\ \"acc_norm_stderr\": 0.03294118479054095\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n\
\ \"acc_stderr\": 0.03629335329947861,\n \"acc_norm\": 0.3192771084337349,\n\
\ \"acc_norm_stderr\": 0.03629335329947861\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.03508771929824564,\n\
\ \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.03508771929824564\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25091799265605874,\n\
\ \"mc1_stderr\": 0.015176985027707696,\n \"mc2\": 0.4046263361255611,\n\
\ \"mc2_stderr\": 0.014888506723649383\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5438042620363063,\n \"acc_stderr\": 0.013998453610924324\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05458680818802123,\n \
\ \"acc_stderr\": 0.006257444037912551\n }\n}\n```"
repo_url: https://huggingface.co/bigcode/starcoderbase-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|arc:challenge|25_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|gsm8k|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hellaswag|10_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T22-30-37.851656.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-14T22-30-37.851656.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- '**/details_harness|winogrande|5_2024-02-14T22-30-37.851656.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-14T22-30-37.851656.parquet'
- config_name: results
data_files:
- split: 2024_02_14T22_30_37.851656
path:
- results_2024-02-14T22-30-37.851656.parquet
- split: latest
path:
- results_2024-02-14T22-30-37.851656.parquet
---
# Dataset Card for Evaluation run of bigcode/starcoderbase-7b
Dataset automatically created during the evaluation run of model [bigcode/starcoderbase-7b](https://huggingface.co/bigcode/starcoderbase-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bigcode__starcoderbase-7b",
"harness_winogrande_5",
split="train")
```
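Each run is also exposed as a split named after the run's timestamp, with the hyphens of the timestamp replaced by underscores (compare the split `2024_02_14T22_30_37.851656` with the run `2024-02-14T22-30-37.851656` in the file names above). A minimal sketch of that mapping, assuming this naming convention holds for all runs:

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp (as it appears in result file names)
    to the corresponding dataset split name, which replaces
    hyphens with underscores."""
    return timestamp.replace("-", "_")

# The run shown in this card maps to its timestamped split:
split_name = run_timestamp_to_split("2024-02-14T22-30-37.851656")
print(split_name)  # 2024_02_14T22_30_37.851656
```

You can then pass that split name (instead of `"latest"` or `"train"`) to `load_dataset` to pin the details of one specific run.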
## Latest results

These are the [latest results from run 2024-02-14T22:30:37.851656](https://huggingface.co/datasets/open-llm-leaderboard/details_bigcode__starcoderbase-7b/blob/main/results_2024-02-14T22-30-37.851656.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in its task-specific configuration, under the "latest" split):
```python
{
"all": {
"acc": 0.2855544717164793,
"acc_stderr": 0.032025544877512004,
"acc_norm": 0.28731800624157283,
"acc_norm_stderr": 0.03278279025567369,
"mc1": 0.25091799265605874,
"mc1_stderr": 0.015176985027707696,
"mc2": 0.4046263361255611,
"mc2_stderr": 0.014888506723649383
},
"harness|arc:challenge|25": {
"acc": 0.2508532423208191,
"acc_stderr": 0.012668198621315435,
"acc_norm": 0.2986348122866894,
"acc_norm_stderr": 0.013374078615068756
},
"harness|hellaswag|10": {
"acc": 0.3551085441147182,
"acc_stderr": 0.004775681871529863,
"acc_norm": 0.4386576379207329,
"acc_norm_stderr": 0.004952087083128893
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03459777606810537,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03459777606810537
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.33962264150943394,
"acc_stderr": 0.02914690474779833,
"acc_norm": 0.33962264150943394,
"acc_norm_stderr": 0.02914690474779833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3194444444444444,
"acc_stderr": 0.03899073687357336,
"acc_norm": 0.3194444444444444,
"acc_norm_stderr": 0.03899073687357336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.1907514450867052,
"acc_stderr": 0.02995785132986934,
"acc_norm": 0.1907514450867052,
"acc_norm_stderr": 0.02995785132986934
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.04158307533083286,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.04158307533083286
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.31063829787234043,
"acc_stderr": 0.03025123757921317,
"acc_norm": 0.31063829787234043,
"acc_norm_stderr": 0.03025123757921317
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893596,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893596
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.30344827586206896,
"acc_stderr": 0.038312260488503336,
"acc_norm": 0.30344827586206896,
"acc_norm_stderr": 0.038312260488503336
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918417,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918417
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.036196045241242515,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.036196045241242515
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23870967741935484,
"acc_stderr": 0.024251071262208834,
"acc_norm": 0.23870967741935484,
"acc_norm_stderr": 0.024251071262208834
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03010833071801162,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03010833071801162
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.033175059300091805,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.033175059300091805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.03191178226713549,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.03191178226713549
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.27461139896373055,
"acc_stderr": 0.03221024508041154,
"acc_norm": 0.27461139896373055,
"acc_norm_stderr": 0.03221024508041154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2512820512820513,
"acc_stderr": 0.02199201666237056,
"acc_norm": 0.2512820512820513,
"acc_norm_stderr": 0.02199201666237056
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.02708037281514567,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.02708037281514567
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.028657491285071973,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.028657491285071973
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.03543304234389985,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.03543304234389985
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.27339449541284405,
"acc_stderr": 0.019109299846098278,
"acc_norm": 0.27339449541284405,
"acc_norm_stderr": 0.019109299846098278
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.028353212866863434,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.028353212866863434
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.03256685484460388,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.03256685484460388
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2911392405063291,
"acc_stderr": 0.029571601065753374,
"acc_norm": 0.2911392405063291,
"acc_norm_stderr": 0.029571601065753374
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.38565022421524664,
"acc_stderr": 0.03266842214289202,
"acc_norm": 0.38565022421524664,
"acc_norm_stderr": 0.03266842214289202
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.29770992366412213,
"acc_stderr": 0.04010358942462203,
"acc_norm": 0.29770992366412213,
"acc_norm_stderr": 0.04010358942462203
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.38016528925619836,
"acc_stderr": 0.04431324501968431,
"acc_norm": 0.38016528925619836,
"acc_norm_stderr": 0.04431324501968431
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.047323326159788154,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.047323326159788154
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340455,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340455
},
"harness|hendrycksTest-management|5": {
"acc": 0.27184466019417475,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.27184466019417475,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.36752136752136755,
"acc_stderr": 0.03158539157745636,
"acc_norm": 0.36752136752136755,
"acc_norm_stderr": 0.03158539157745636
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.3052362707535121,
"acc_stderr": 0.016467711947635123,
"acc_norm": 0.3052362707535121,
"acc_norm_stderr": 0.016467711947635123
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.30346820809248554,
"acc_stderr": 0.024752411960917212,
"acc_norm": 0.30346820809248554,
"acc_norm_stderr": 0.024752411960917212
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.014310999547961459,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.014310999547961459
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.025058503316958157,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.025058503316958157
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3086816720257235,
"acc_stderr": 0.026236965881153256,
"acc_norm": 0.3086816720257235,
"acc_norm_stderr": 0.026236965881153256
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2839506172839506,
"acc_stderr": 0.02508947852376513,
"acc_norm": 0.2839506172839506,
"acc_norm_stderr": 0.02508947852376513
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.30851063829787234,
"acc_stderr": 0.027553366165101362,
"acc_norm": 0.30851063829787234,
"acc_norm_stderr": 0.027553366165101362
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2796610169491525,
"acc_stderr": 0.011463397393861974,
"acc_norm": 0.2796610169491525,
"acc_norm_stderr": 0.011463397393861974
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.23161764705882354,
"acc_stderr": 0.025626533803777565,
"acc_norm": 0.23161764705882354,
"acc_norm_stderr": 0.025626533803777565
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2630718954248366,
"acc_stderr": 0.017812676542320657,
"acc_norm": 0.2630718954248366,
"acc_norm_stderr": 0.017812676542320657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2818181818181818,
"acc_stderr": 0.04309118709946459,
"acc_norm": 0.2818181818181818,
"acc_norm_stderr": 0.04309118709946459
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.33877551020408164,
"acc_stderr": 0.030299506562154185,
"acc_norm": 0.33877551020408164,
"acc_norm_stderr": 0.030299506562154185
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.31840796019900497,
"acc_stderr": 0.03294118479054095,
"acc_norm": 0.31840796019900497,
"acc_norm_stderr": 0.03294118479054095
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.03629335329947861,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.03629335329947861
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.03508771929824564,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.03508771929824564
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25091799265605874,
"mc1_stderr": 0.015176985027707696,
"mc2": 0.4046263361255611,
"mc2_stderr": 0.014888506723649383
},
"harness|winogrande|5": {
"acc": 0.5438042620363063,
"acc_stderr": 0.013998453610924324
},
"harness|gsm8k|5": {
"acc": 0.05458680818802123,
"acc_stderr": 0.006257444037912551
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Datasaur/Mongabay-collection | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: string
- name: tags
dtype: string
splits:
- name: tags_train
num_bytes: 7618915
num_examples: 3919
- name: tags_validation
num_bytes: 883992
num_examples: 492
- name: tags_test
num_bytes: 871093
num_examples: 485
- name: sentiment_train
num_bytes: 7208247
num_examples: 3919
- name: sentiment_validation
num_bytes: 874803
num_examples: 492
- name: sentiment_test
num_bytes: 861831
num_examples: 485
download_size: 9550225
dataset_size: 18318881
---
# Dataset Card for "Mongabay-collection"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ibranze/araproje_mmlu_tr_f5 | ---
dataset_info:
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: validation
num_bytes: 137404.0
num_examples: 250
download_size: 0
dataset_size: 137404.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_mmlu_tr_f5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
OdiaGenAI/hardcode_odia_qa_105 | ---
license: cc-by-nc-4.0
---
|
mzschwartz88/seg5 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 32989798.0
num_examples: 800
- name: validation
num_bytes: 8442415.0
num_examples: 200
download_size: 41361094
dataset_size: 41432213.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
daniel-dona/tfg-voice-2 | ---
license: cc-by-sa-3.0
---
|
GalacticV/Aria_3 | ---
license: openrail
---
|
furry-br/stella | ---
license: openrail
---
|
mozilla-foundation/youtube_regrets | ---
license: cc0-1.0
---
# Dataset Card for Mozilla RegretsReporter Public Data
## Dataset Description
- **Homepage: https://foundation.mozilla.org/en/youtube/**
- **Repository: https://github.com/mozilla-extensions/regrets-reporter**
- **Paper: https://foundation.mozilla.org/en/youtube/user-controls/**
- **Point of Contact: publicdata@mozillafoundation.org**
### Dataset Summary
#### RegretsReporter Data
This dataset card describes the public datasets made available based on [Mozilla’s RegretsReporter
research](https://foundation.mozilla.org/en/youtube/), as well as the [Viu Política research](https://en.vero.org.br/projetos/viu-politica)
from the University of Exeter and Vero Instituto.
This data was collected from participants in Mozilla’s RegretsReporter studies.
Participants installed a web extension to participate in each study. In the
first study, data was collected from all participants who installed the extension. In the
second, data was collected only from participants who positively opted in to experiment participation
after installing the extension.
#### Viu Politica Data
This data was collected from the Viu Política research project. Data was only collected from participants that positively opted in to experiment participation after installing the extension.
The videos included are predominantly in Portuguese. The extension UX was available in Portuguese only, and promotional materials about the study were released in Brazilian Portuguese, for a Brazilian audience.
For this study, data includes “tagged videos”, “recommendations”, and “collected videos”:
- Tagged videos are videos that study participants considered to contain political content. They are videos for which the participant pressed the “Viu Politica?” (“See Politics?”, loosely translated) button overlayed on the video player or thumbnail.
- Recommendations are videos recommended to our participants by YouTube, on the sidebar of the video player page, when the participant is watching a video they tagged as political. Only videos loaded in the browser are included, so the “infinite scroll” of the sidebar is only included to the extent that it is loaded, which will depend on participant scrolling behavior.
- Collected videos are videos manually identified on YouTube using a selection of keywords relevant to the 2022 Brazilian elections. These videos were used in the research project to support the two datasets above, and provide further coverage of the videos posted on YouTube during the electoral period.
- The full transcripts of all tagged, recommended, and collected videos are also included, as a separate dataset.
### Languages
The RegretsReporter videos included are predominantly in English, but span
over 100 different languages, based on our automatic classification.
The extension UX was available in English only, but the extension was still used
by speakers of many other languages. The promotional materials about the studies were released
in English, Dutch, French, German, Spanish and Brazilian Portuguese.
The Viu Política videos are primarily in Portuguese, and that study was conducted exclusively in Portuguese.
## Dataset Structure
Data includes “regrets” and “recommendations”:
- Regrets are videos that our study participants considered undesirable in some sense:
- In the first study, they are videos for which the participant pressed the “Report Regret” toolbar button.
- In the second study, they are videos for which the participant pressed the “Stop Recommending” button overlayed on the video player or thumbnail.
- Recommendations are videos recommended to our participants by YouTube, either on the front YouTube page, or the sidebar of the video player page. Only videos loaded in the browser are included, so the “infinite scroll” of the sidebar is only included to the extent that it is loaded, which will depend on participant scrolling behavior.
### Data Fields
#### RegretsReporter 1: regrets
Number of rows: 4,760
Total logical bytes: 12.03 MB
The data contains the following fields:
- submission_date - Date of regret
- submission_country - Country from which regret was sent according to IPGeo lookup.
- regret_video_id - YouTube video ID
- regret_video_title - Title of video
- regret_video_description - Description of video (from YouTube)
- regret_video_view_count - View count of video at time of regret
- video_post_date - Posting date of video
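For illustration, a single row with this schema might look like the following in Python. Only the field names come from the schema above; every value here is made up:

```python
# Hypothetical example of one RegretsReporter 1 "regret" row.
# Field names follow the schema above; all values are illustrative.
regret_row = {
    "submission_date": "2021-03-14",
    "submission_country": "DE",
    "regret_video_id": "abc123XYZ",
    "regret_video_title": "Example video title",
    "regret_video_description": "Example description text.",
    "regret_video_view_count": 123456,
    "video_post_date": "2020-11-02",
}
```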
#### RegretsReporter 2 (user control study): regrets
Number of rows: 20,633
Total logical bytes: 22 MB
The data contains the following fields:
- submission_date - Date of regret
- submission_country - Country from which regret was sent according to IPGeo lookup.
- regret_video_id - YouTube video ID
- regret_video_title - Title of video
- regret_video_channel - Video Channel (canonicalized)
- regret_video_description - Description of video (from YouTube)
- regret_video_view_count - View count of video at time of regret
#### RegretsReporter 2 (user control study): recommendations
Number of rows: 96,001,836
Total logical bytes: 115.72 GB
The data contains the following fields:
- submission_date - Date of recommendation
- submission_country - Country from which report was sent according to IPGeo lookup.
- recommendation_video_id - YouTube video ID
- recommendation_video_title - Title of video
- recommendation_video_channel - Video Channel (canonicalized)
- recommendation_video_description - Description of video (from YouTube)
- recommendation_video_view_count - View count of video at time of regret
#### Viu Política: tagged videos
Number of rows: 1,248
Total logical bytes: 807.61 KB
The data contains the following fields:
- tagged_video_id - YouTube video ID
- submission_date - Date of video tagging
- video_title - Title of video
- view_count - View count of video at time of tagging
- video_like_count - Like count of video at time of tagging
- video_channel - Name of the YouTube channel where the video was posted
- video_channel_link - Link/ID of the YouTube channel where the video was posted
- video_full_description - YouTube description of tagged video
- video_thumbnail - Link to the thumbnail of the tagged video
#### Viu Política: recommended videos
Number of rows: 294,697
Total logical bytes: 245.31 MB
The data contains the following fields:
- recommended_video_id - YouTube video ID of recommended video
- submission_date - Date of video tagging
- recommended_video_title - Title of recommended video
- recommended_view_count - View count of recommended video at time of tagging
- recommended_video_like_count - Like count of recommended video at time of tagging
- recommended_video_channel - Name of the YouTube channel where the recommended video was posted
- recommended_video_channel_link - Link/ID of the YouTube channel where the recommended video was posted
- recommended_video_full_description - YouTube description of recommended video
- recommended_video_thumbnail - Link to the thumbnail of the recommended video
#### Viu Política: collected videos
Number of rows: 23,245
Total logical bytes: 11.59 MB
The data contains the following fields:
- collected_video_id - YouTube video ID of collected video
- collected_video_title - Title of collected video
- collected_view_count - View count of collected video at time of tagging
- collected_video_like_count - Like count of collected video at time of tagging
- collected_video_channel - Name of the YouTube channel where the collected video was posted
- collected_video_channel_link - Link/ID of the YouTube channel where the collected video was posted
- collected_video_full_description - YouTube description of collected video
- collected_video_thumbnail - Link to the thumbnail of the collected video
#### Viu Política: video transcriptions
Total number of .txt files containing video transcriptions: 33,054
Total logical bytes: 735.62 MB
## Caveats
### RegretsReporter Data
- Our participants are not representative of YouTube users in general:
- We promoted the study to the Mozilla network, which has a particular bias, although some paid promotion was employed and, in the first study, we did specifically target demographics that are underrepresented in our community.
- All participants were aware their data would be collected during the study, which could have influenced their behavior.
- The UX of the extensions was only available in English, which probably biased our respondents towards English-speakers, although the extension was still used by some non-English speakers.
- The extension was only available for the desktop versions of Firefox and Chrome.
- Regrets are subjective and we have not systematically filtered inauthentic use.
- Recommendations are only recorded and regrets are only possible on the desktop device for which the extension was installed.
- Country data is based on IPGeo lookup and may have systematic errors. It is occasionally missing when the lookup fails.
- The video metadata (title, channel, etc.) is often missing as it was only acquired for a subset of the data.
### Viu Politica Data
- The participants are not representative of YouTube users in general:
- The promotion of our study relied on digital influencers and researchers, press coverage, construction of a personalized landing page, and social media promotion, with a focus on Twitter and Instagram, as well as through WhatsApp and email – all of which will inevitably have a particular bias when it comes to the users who respond to it.
- All participants were aware their data would be collected during the study, which could have influenced their behavior.
- The UX of the extension was only available in Brazilian Portuguese, and advertised to Brazilian audiences.
- The extension was only available for the desktop versions of Firefox and Chrome.
- Whether a given video is political is a subjective decision and we have not systematically filtered inauthentic use.
|
jinwoos/cartoonizer-dataset-450 | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: edit_prompt
dtype: string
- name: cartoonized_image
dtype: image
splits:
- name: train
num_bytes: 7344682503.0
num_examples: 450
download_size: 7344217953
dataset_size: 7344682503.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Jacobvs/PoliticalTweets | ---
license: mit
---
|
jhu-clsp/FollowIR-train | ---
license: apache-2.0
language:
- en
tags:
- retrieval
- information retrieval
pretty_name: FollowIR-train
size_categories:
- 1K<n<10K
---
# Dataset Summary
FollowIR-train contains ~1800 query and instruction pairs, with labels for relevance (true or false). It can be used to train retrieval models to better follow instructions (see [FollowIR-7B](https://huggingface.co/jhu-clsp/FollowIR-7B)).
The dataset was created by taking instruction and query pairs from all [TREC tracks](https://trec.nist.gov/) (which provide instructions as "narratives") from 1993 onward that provided these instructions. Synthetic documents were then created with GPT-3.5-Turbo-1106 and filtered using Mistral-Instruct-7B-v0.2. This dataset contains the filtered instructions only. See [jhu-clsp/FollowIR-train-raw]() for the raw data before filtering.
- **Repository:** [orionw/FollowIR](https://github.com/orionw/FollowIR)
- **Paper:** https://arxiv.org/abs/2403.15246
- **Model Trained on the Dataset:** [jhu-clsp/FollowIR-7B](https://huggingface.co/jhu-clsp/FollowIR-7B/)
The structure of the dataset is as follows:
```
{
"score": the score from Mistral-Instruct-7B-v0.2 of whether it was relevant or not (1 is relevant, 0 is not)
"label": the label of relevance from GPT-3.5-Turbo-1106 who created the document
"id": the id from the original TREC track and the file it came from
"document": the synthetic document produced by GPT-3.5-Turbo-1106 given the original instruction, query, and label
"query": the query written by TREC
"instruction": the instruction (or narrative) written by TREC for human annotation
}
```
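As a sketch of how these fields might be used, the snippet below keeps only the pairs where the GPT-produced `label` and the Mistral `score` agree that the document is relevant. The records are made up for illustration; only the field names come from the structure above:

```python
# Hypothetical FollowIR-train records; field names follow the structure above,
# but all ids and text values are placeholders.
records = [
    {"id": "example-301", "score": 1, "label": 1,
     "query": "example query", "instruction": "example narrative",
     "document": "synthetic document text"},
    {"id": "example-302", "score": 0, "label": 1,
     "query": "example query", "instruction": "example narrative",
     "document": "synthetic document text"},
]

# Keep only pairs where the Mistral filter agrees with the generated label
# that the document is relevant (score == 1 and label == 1).
relevant = [r for r in records if r["score"] == 1 and r["label"] == 1]
```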
# Citation
```bibtex
@misc{weller2024followir,
title={FollowIR: Evaluating and Teaching Information Retrieval Models to Follow Instructions},
author={Orion Weller and Benjamin Chang and Sean MacAvaney and Kyle Lo and Arman Cohan and Benjamin Van Durme and Dawn Lawrie and Luca Soldaini},
year={2024},
eprint={2403.15246},
archivePrefix={arXiv},
primaryClass={cs.IR}
}
``` |
fjd/scannet_sample | ---
license: cc-by-nc-nd-4.0
---
|
open-llm-leaderboard/details_andysalerno__openchat-nectar-0.6 | ---
pretty_name: Evaluation run of andysalerno/openchat-nectar-0.6
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [andysalerno/openchat-nectar-0.6](https://huggingface.co/andysalerno/openchat-nectar-0.6)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_andysalerno__openchat-nectar-0.6\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-16T22:42:05.563156](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__openchat-nectar-0.6/blob/main/results_2024-01-16T22-42-05.563156.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6546898298965087,\n\
\ \"acc_stderr\": 0.031907604367501376,\n \"acc_norm\": 0.6552224867949306,\n\
\ \"acc_norm_stderr\": 0.03256520701245893,\n \"mc1\": 0.35495716034271724,\n\
\ \"mc1_stderr\": 0.016750862381375898,\n \"mc2\": 0.5190433843192046,\n\
\ \"mc2_stderr\": 0.01538697013474084\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6313993174061433,\n \"acc_stderr\": 0.014097810678042201,\n\
\ \"acc_norm\": 0.6655290102389079,\n \"acc_norm_stderr\": 0.013787460322441372\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6346345349531965,\n\
\ \"acc_stderr\": 0.004805483767055348,\n \"acc_norm\": 0.8322047400916153,\n\
\ \"acc_norm_stderr\": 0.0037292066767701934\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n\
\ \"acc_stderr\": 0.03533133389323657,\n \"acc_norm\": 0.6878612716763006,\n\
\ \"acc_norm_stderr\": 0.03533133389323657\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43386243386243384,\n \"acc_stderr\": 0.02552503438247489,\n \"\
acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.02552503438247489\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n\
\ \"acc_stderr\": 0.02315787934908353,\n \"acc_norm\": 0.7903225806451613,\n\
\ \"acc_norm_stderr\": 0.02315787934908353\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37777777777777777,\n \"acc_stderr\": 0.029560707392465718,\n \
\ \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.029560707392465718\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.0340763209385405,\n \"acc_norm\"\
: 0.5185185185185185,\n \"acc_norm_stderr\": 0.0340763209385405\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n\
\ \"acc_stderr\": 0.02584501798692692,\n \"acc_norm\": 0.8382352941176471,\n\
\ \"acc_norm_stderr\": 0.02584501798692692\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944867,\n\
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944867\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n\
\ \"acc_stderr\": 0.03050028317654585,\n \"acc_norm\": 0.7085201793721974,\n\
\ \"acc_norm_stderr\": 0.03050028317654585\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n\
\ \"acc_stderr\": 0.013306478243066302,\n \"acc_norm\": 0.8339719029374202,\n\
\ \"acc_norm_stderr\": 0.013306478243066302\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7543352601156069,\n \"acc_stderr\": 0.023176298203992005,\n\
\ \"acc_norm\": 0.7543352601156069,\n \"acc_norm_stderr\": 0.023176298203992005\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25027932960893856,\n\
\ \"acc_stderr\": 0.01448750085285042,\n \"acc_norm\": 0.25027932960893856,\n\
\ \"acc_norm_stderr\": 0.01448750085285042\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959607,\n\
\ \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959607\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4921773142112125,\n\
\ \"acc_stderr\": 0.0127686730761119,\n \"acc_norm\": 0.4921773142112125,\n\
\ \"acc_norm_stderr\": 0.0127686730761119\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7463235294117647,\n \"acc_stderr\": 0.026431329870789503,\n\
\ \"acc_norm\": 0.7463235294117647,\n \"acc_norm_stderr\": 0.026431329870789503\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.01895088677080631,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.01895088677080631\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7673469387755102,\n \"acc_stderr\": 0.02704925791589618,\n\
\ \"acc_norm\": 0.7673469387755102,\n \"acc_norm_stderr\": 0.02704925791589618\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306053,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306053\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35495716034271724,\n\
\ \"mc1_stderr\": 0.016750862381375898,\n \"mc2\": 0.5190433843192046,\n\
\ \"mc2_stderr\": 0.01538697013474084\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8121546961325967,\n \"acc_stderr\": 0.010977481103435091\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6974981046247157,\n \
\ \"acc_stderr\": 0.012652544133186132\n }\n}\n```"
repo_url: https://huggingface.co/andysalerno/openchat-nectar-0.6
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|arc:challenge|25_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|gsm8k|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hellaswag|10_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T22-42-05.563156.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T22-42-05.563156.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- '**/details_harness|winogrande|5_2024-01-16T22-42-05.563156.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-16T22-42-05.563156.parquet'
- config_name: results
data_files:
- split: 2024_01_16T22_42_05.563156
path:
- results_2024-01-16T22-42-05.563156.parquet
- split: latest
path:
- results_2024-01-16T22-42-05.563156.parquet
---
# Dataset Card for Evaluation run of andysalerno/openchat-nectar-0.6
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [andysalerno/openchat-nectar-0.6](https://huggingface.co/andysalerno/openchat-nectar-0.6) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_andysalerno__openchat-nectar-0.6",
"harness_winogrande_5",
split="train")
```
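The details repository name above follows a simple convention: the model id with `/` replaced by `__`, prefixed with `open-llm-leaderboard/details_`. A minimal helper illustrating that convention (an assumption drawn only from the repository name shown above, not an official API) could look like:

```python
def details_repo(model_id: str) -> str:
    """Build the Open LLM Leaderboard details-repository name for a model.

    Per-model evaluation details are published under
    ``open-llm-leaderboard/details_<org>__<model>``, i.e. the model id
    with "/" replaced by "__".
    """
    return "open-llm-leaderboard/details_" + model_id.replace("/", "__")

print(details_repo("andysalerno/openchat-nectar-0.6"))
# open-llm-leaderboard/details_andysalerno__openchat-nectar-0.6
```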
## Latest results
These are the [latest results from run 2024-01-16T22:42:05.563156](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__openchat-nectar-0.6/blob/main/results_2024-01-16T22-42-05.563156.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6546898298965087,
"acc_stderr": 0.031907604367501376,
"acc_norm": 0.6552224867949306,
"acc_norm_stderr": 0.03256520701245893,
"mc1": 0.35495716034271724,
"mc1_stderr": 0.016750862381375898,
"mc2": 0.5190433843192046,
"mc2_stderr": 0.01538697013474084
},
"harness|arc:challenge|25": {
"acc": 0.6313993174061433,
"acc_stderr": 0.014097810678042201,
"acc_norm": 0.6655290102389079,
"acc_norm_stderr": 0.013787460322441372
},
"harness|hellaswag|10": {
"acc": 0.6346345349531965,
"acc_stderr": 0.004805483767055348,
"acc_norm": 0.8322047400916153,
"acc_norm_stderr": 0.0037292066767701934
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.040943762699967926,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.040943762699967926
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.03533133389323657,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.03533133389323657
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.02552503438247489,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.02552503438247489
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.02315787934908353,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.02315787934908353
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047711,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047711
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.029560707392465718,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.029560707392465718
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.0340763209385405,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.0340763209385405
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944867,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944867
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.03050028317654585,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.03050028317654585
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8339719029374202,
"acc_stderr": 0.013306478243066302,
"acc_norm": 0.8339719029374202,
"acc_norm_stderr": 0.013306478243066302
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7543352601156069,
"acc_stderr": 0.023176298203992005,
"acc_norm": 0.7543352601156069,
"acc_norm_stderr": 0.023176298203992005
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25027932960893856,
"acc_stderr": 0.01448750085285042,
"acc_norm": 0.25027932960893856,
"acc_norm_stderr": 0.01448750085285042
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959607,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959607
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4921773142112125,
"acc_stderr": 0.0127686730761119,
"acc_norm": 0.4921773142112125,
"acc_norm_stderr": 0.0127686730761119
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7463235294117647,
"acc_stderr": 0.026431329870789503,
"acc_norm": 0.7463235294117647,
"acc_norm_stderr": 0.026431329870789503
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.01895088677080631,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.01895088677080631
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7673469387755102,
"acc_stderr": 0.02704925791589618,
"acc_norm": 0.7673469387755102,
"acc_norm_stderr": 0.02704925791589618
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306053,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306053
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35495716034271724,
"mc1_stderr": 0.016750862381375898,
"mc2": 0.5190433843192046,
"mc2_stderr": 0.01538697013474084
},
"harness|winogrande|5": {
"acc": 0.8121546961325967,
"acc_stderr": 0.010977481103435091
},
"harness|gsm8k|5": {
"acc": 0.6974981046247157,
"acc_stderr": 0.012652544133186132
}
}
```
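The `"all"` block above aggregates the per-task metrics. As a rough sketch of that aggregation (an unweighted mean over a small hypothetical excerpt of the per-task results, not the full set of configurations):

```python
# Hypothetical excerpt of the per-task results shown above.
results = {
    "harness|hendrycksTest-management|5": {"acc": 0.8058252427184466},
    "harness|hendrycksTest-marketing|5": {"acc": 0.8931623931623932},
    "harness|hendrycksTest-virology|5": {"acc": 0.5240963855421686},
}

# Unweighted mean accuracy over the selected tasks.
mean_acc = sum(task["acc"] for task in results.values()) / len(results)
print(round(mean_acc, 4))
```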
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
annotations_creators: []
language_creators: []
language:
- en
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
paperswithcode_id: beir
pretty_name: BEIR Benchmark
size_categories:
msmarco:
- 1M<n<10M
trec-covid:
- 100K<n<1M
nfcorpus:
- 1K<n<10K
nq:
- 1M<n<10M
hotpotqa:
- 1M<n<10M
fiqa:
- 10K<n<100K
arguana:
- 1K<n<10K
touche-2020:
- 100K<n<1M
cqadupstack:
- 100K<n<1M
quora:
- 100K<n<1M
dbpedia:
- 1M<n<10M
scidocs:
- 10K<n<100K
fever:
- 1M<n<10M
climate-fever:
- 1M<n<10M
scifact:
- 1K<n<10K
source_datasets: []
task_categories:
- text-retrieval
- zero-shot-retrieval
- information-retrieval
- zero-shot-information-retrieval
task_ids:
- passage-retrieval
- entity-linking-retrieval
- fact-checking-retrieval
- tweet-retrieval
- citation-prediction-retrieval
- duplication-question-retrieval
- argument-retrieval
- news-retrieval
- biomedical-information-retrieval
- question-answering-retrieval
---
# Dataset Card for BEIR Benchmark
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/UKPLab/beir
- **Repository:** https://github.com/UKPLab/beir
- **Paper:** https://openreview.net/forum?id=wCu6T5xFjeJ
- **Leaderboard:** https://docs.google.com/spreadsheets/d/1L8aACyPaXrL8iEelJLGqlMqXKPX2oSP_R10pZoy77Ns
- **Point of Contact:** nandan.thakur@uwaterloo.ca
### Dataset Summary
BEIR is a heterogeneous benchmark that has been built from 18 diverse datasets representing 9 information retrieval tasks:
- Fact-checking: [FEVER](http://fever.ai), [Climate-FEVER](http://climatefever.ai), [SciFact](https://github.com/allenai/scifact)
- Question-Answering: [NQ](https://ai.google.com/research/NaturalQuestions), [HotpotQA](https://hotpotqa.github.io), [FiQA-2018](https://sites.google.com/view/fiqa/)
- Bio-Medical IR: [TREC-COVID](https://ir.nist.gov/covidSubmit/index.html), [BioASQ](http://bioasq.org), [NFCorpus](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/)
- News Retrieval: [TREC-NEWS](https://trec.nist.gov/data/news2019.html), [Robust04](https://trec.nist.gov/data/robust/04.guidelines.html)
- Argument Retrieval: [Touche-2020](https://webis.de/events/touche-20/shared-task-1.html), [ArguAna](http://argumentation.bplaced.net/arguana/data)
- Duplicate Question Retrieval: [Quora](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs), [CqaDupstack](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/)
- Citation-Prediction: [SCIDOCS](https://allenai.org/data/scidocs)
- Tweet Retrieval: [Signal-1M](https://research.signal-ai.com/datasets/signal1m-tweetir.html)
- Entity Retrieval: [DBPedia](https://github.com/iai-group/DBpedia-Entity/)
All these datasets have been preprocessed and can be used for your experiments.
```python
# For example, this dataset's documents and queries can be loaded with the
# Hugging Face `datasets` library (assuming the "corpus" and "queries"
# configurations used by the BeIR repositories):
from datasets import load_dataset

corpus = load_dataset("BeIR/climate-fever", "corpus")
queries = load_dataset("BeIR/climate-fever", "queries")
```
### Supported Tasks and Leaderboards
The dataset supports a leaderboard for zero-shot evaluation of information-retrieval models; results are typically reported with rank-based metrics such as nDCG@10.
The current best performing models can be found [here](https://eval.ai/web/challenges/challenge-page/689/leaderboard/).
### Languages
All tasks are in English (`en`).
## Dataset Structure
All BEIR datasets must contain a corpus, queries and qrels (relevance judgments file). They must be in the following format:
- `corpus` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with three fields `_id` with unique document identifier, `title` with document title (optional) and `text` with document paragraph or passage. For example: `{"_id": "doc1", "title": "Albert Einstein", "text": "Albert Einstein was a German-born...."}`
- `queries` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with two fields `_id` with unique query identifier and `text` with query text. For example: `{"_id": "q1", "text": "Who developed the mass-energy equivalence formula?"}`
- `qrels` file: a `.tsv` file (tab-separated) that contains three columns, i.e. the `query-id`, `corpus-id` and `score` in this order. Keep the first row as a header. For example: `q1 doc1 1`
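As a sketch of how such a qrels file can be parsed (assuming the tab-separated layout with a header row described above; the function name is illustrative):

```python
import csv
import io

def load_qrels(tsv_text: str) -> dict:
    """Parse a BEIR-style qrels TSV (query-id, corpus-id, score)
    into a nested dict: {query_id: {corpus_id: score}}."""
    qrels = {}
    reader = csv.reader(io.StringIO(tsv_text), delimiter="\t")
    next(reader)  # skip the header row
    for query_id, corpus_id, score in reader:
        qrels.setdefault(query_id, {})[corpus_id] = int(score)
    return qrels

example = "query-id\tcorpus-id\tscore\nq1\tdoc1\t1\n"
print(load_qrels(example))  # {'q1': {'doc1': 1}}
```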
### Data Instances
A high-level example from any BEIR dataset:
```python
corpus = {
"doc1" : {
"title": "Albert Einstein",
"text": "Albert Einstein was a German-born theoretical physicist. who developed the theory of relativity, \
one of the two pillars of modern physics (alongside quantum mechanics). His work is also known for \
its influence on the philosophy of science. He is best known to the general public for his mass–energy \
equivalence formula E = mc2, which has been dubbed 'the world's most famous equation'. He received the 1921 \
Nobel Prize in Physics 'for his services to theoretical physics, and especially for his discovery of the law \
of the photoelectric effect', a pivotal step in the development of quantum theory."
},
"doc2" : {
"title": "", # Keep title an empty string if not present
"text": "Wheat beer is a top-fermented beer which is brewed with a large proportion of wheat relative to the amount of \
malted barley. The two main varieties are German Weißbier and Belgian witbier; other types include Lambic (made\
with wild yeast), Berliner Weisse (a cloudy, sour beer), and Gose (a sour, salty beer)."
},
}
queries = {
"q1" : "Who developed the mass-energy equivalence formula?",
"q2" : "Which beer is brewed with a large proportion of wheat?"
}
qrels = {
"q1" : {"doc1": 1},
"q2" : {"doc2": 1},
}
```
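Given such qrels and a ranked list of retrieved document ids per query, a retrieval metric can be computed. Below is a minimal recall@k sketch (for illustration only; the official BEIR evaluation reports metrics such as nDCG@10):

```python
def recall_at_k(qrels: dict, ranked: dict, k: int) -> float:
    """Fraction of relevant documents found in the top-k results,
    averaged over queries."""
    scores = []
    for qid, relevant in qrels.items():
        top_k = ranked.get(qid, [])[:k]
        hits = sum(1 for doc_id in relevant if doc_id in top_k)
        scores.append(hits / len(relevant))
    return sum(scores) / len(scores)

qrels = {"q1": {"doc1": 1}, "q2": {"doc2": 1}}
ranked = {"q1": ["doc1", "doc2"], "q2": ["doc1", "doc2"]}
print(recall_at_k(qrels, ranked, 1))  # 0.5
```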
### Data Fields
Examples from all configurations have the following features:
### Corpus
- `corpus`: a `dict` feature representing the document title and passage text, made up of:
- `_id`: a `string` feature representing the unique document id
- `title`: a `string` feature, denoting the title of the document.
- `text`: a `string` feature, denoting the text of the document.
### Queries
- `queries`: a `dict` feature representing the query, made up of:
- `_id`: a `string` feature representing the unique query id
- `text`: a `string` feature, denoting the text of the query.
### Qrels
- `qrels`: a `dict` feature representing the query-document relevance judgements, made up of:
- `query-id`: a `string` feature representing the query id
- `corpus-id`: a `string` feature, denoting the document id.
- `score`: a `int32` feature, denoting the relevance judgement between query and document.
### Data Splits
| Dataset | Website| BEIR-Name | Type | Queries | Corpus | Rel D/Q | Down-load | md5 |
| -------- | -----| ---------| --------- | ----------- | ---------| ---------| :----------: | :------:|
| MSMARCO | [Homepage](https://microsoft.github.io/msmarco/)| ``msmarco`` | ``train``<br>``dev``<br>``test``| 6,980 | 8.84M | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/msmarco.zip) | ``444067daf65d982533ea17ebd59501e4`` |
| TREC-COVID | [Homepage](https://ir.nist.gov/covidSubmit/index.html)| ``trec-covid``| ``test``| 50| 171K| 493.5 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/trec-covid.zip) | ``ce62140cb23feb9becf6270d0d1fe6d1`` |
| NFCorpus | [Homepage](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/) | ``nfcorpus`` | ``train``<br>``dev``<br>``test``| 323 | 3.6K | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nfcorpus.zip) | ``a89dba18a62ef92f7d323ec890a0d38d`` |
| BioASQ | [Homepage](http://bioasq.org) | ``bioasq``| ``train``<br>``test`` | 500 | 14.91M | 8.05 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#2-bioasq) |
| NQ | [Homepage](https://ai.google.com/research/NaturalQuestions) | ``nq``| ``train``<br>``test``| 3,452 | 2.68M | 1.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nq.zip) | ``d4d3d2e48787a744b6f6e691ff534307`` |
| HotpotQA | [Homepage](https://hotpotqa.github.io) | ``hotpotqa``| ``train``<br>``dev``<br>``test``| 7,405 | 5.23M | 2.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/hotpotqa.zip) | ``f412724f78b0d91183a0e86805e16114`` |
| FiQA-2018 | [Homepage](https://sites.google.com/view/fiqa/) | ``fiqa`` | ``train``<br>``dev``<br>``test``| 648 | 57K | 2.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fiqa.zip) | ``17918ed23cd04fb15047f73e6c3bd9d9`` |
| Signal-1M(RT) | [Homepage](https://research.signal-ai.com/datasets/signal1m-tweetir.html)| ``signal1m`` | ``test``| 97 | 2.86M | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#4-signal-1m) |
| TREC-NEWS | [Homepage](https://trec.nist.gov/data/news2019.html) | ``trec-news`` | ``test``| 57 | 595K | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#1-trec-news) |
| ArguAna | [Homepage](http://argumentation.bplaced.net/arguana/data) | ``arguana``| ``test`` | 1,406 | 8.67K | 1.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/arguana.zip) | ``8ad3e3c2a5867cdced806d6503f29b99`` |
| Touche-2020| [Homepage](https://webis.de/events/touche-20/shared-task-1.html) | ``webis-touche2020``| ``test``| 49 | 382K | 19.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/webis-touche2020.zip) | ``46f650ba5a527fc69e0a6521c5a23563`` |
| CQADupstack| [Homepage](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/) | ``cqadupstack``| ``test``| 13,145 | 457K | 1.4 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/cqadupstack.zip) | ``4e41456d7df8ee7760a7f866133bda78`` |
| Quora| [Homepage](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs) | ``quora``| ``dev``<br>``test``| 10,000 | 523K | 1.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/quora.zip) | ``18fb154900ba42a600f84b839c173167`` |
| DBPedia | [Homepage](https://github.com/iai-group/DBpedia-Entity/) | ``dbpedia-entity``| ``dev``<br>``test``| 400 | 4.63M | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/dbpedia-entity.zip) | ``c2a39eb420a3164af735795df012ac2c`` |
| SCIDOCS| [Homepage](https://allenai.org/data/scidocs) | ``scidocs``| ``test``| 1,000 | 25K | 4.9 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scidocs.zip) | ``38121350fc3a4d2f48850f6aff52e4a9`` |
| FEVER | [Homepage](http://fever.ai) | ``fever``| ``train``<br>``dev``<br>``test``| 6,666 | 5.42M | 1.2| [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fever.zip) | ``5a818580227bfb4b35bb6fa46d9b6c03`` |
| Climate-FEVER| [Homepage](http://climatefever.ai) | ``climate-fever``|``test``| 1,535 | 5.42M | 3.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/climate-fever.zip) | ``8b66f0a9126c521bae2bde127b4dc99d`` |
| SciFact| [Homepage](https://github.com/allenai/scifact) | ``scifact``| ``train``<br>``test``| 300 | 5K | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scifact.zip) | ``5f7d1de60b170fc8027bb7898e2efca1`` |
| Robust04 | [Homepage](https://trec.nist.gov/data/robust/04.guidelines.html) | ``robust04``| ``test``| 249 | 528K | 69.9 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#3-robust04) |
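Each downloadable archive in the table above comes with an MD5 checksum. A downloaded zip can be verified before unzipping with a short stdlib helper; this is a sketch (`md5_of_file` and the local file path are our own names, not part of BEIR):

```python
import hashlib

def md5_of_file(path, chunk_size=1 << 20):
    """Compute the MD5 hex digest of a file, reading in 1 MiB chunks."""
    digest = hashlib.md5()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Verify e.g. the SciFact archive against the checksum from the table above
# (the local path is a placeholder for wherever the zip was saved):
# assert md5_of_file("scifact.zip") == "5f7d1de60b170fc8027bb7898e2efca1"
```

Reading in chunks keeps memory use constant even for the multi-gigabyte archives listed above.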
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
Cite as:
```
@inproceedings{
thakur2021beir,
title={{BEIR}: A Heterogeneous Benchmark for Zero-shot Evaluation of Information Retrieval Models},
author={Nandan Thakur and Nils Reimers and Andreas R{\"u}ckl{\'e} and Abhishek Srivastava and Iryna Gurevych},
booktitle={Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 2)},
year={2021},
url={https://openreview.net/forum?id=wCu6T5xFjeJ}
}
```
### Contributions
Thanks to [@Nthakur20](https://github.com/Nthakur20) for adding this dataset. |
CyberHarem/queen_medb_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of queen_medb/女王メイヴ/女王梅芙 (Fate/Grand Order)
This is the dataset of queen_medb/女王メイヴ/女王梅芙 (Fate/Grand Order), containing 500 images and their tags.
The core tags of this character are `pink_hair, long_hair, yellow_eyes, breasts, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 630.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/queen_medb_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200             | 500      | 566.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/queen_medb_fgo/resolve/main/dataset-1200.zip)               | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 1176 | 1.05 GiB | [Download](https://huggingface.co/datasets/CyberHarem/queen_medb_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/queen_medb_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, elbow_gloves, holding_whip, looking_at_viewer, riding_crop, solo, tiara, white_gloves, smile, white_skirt, :q, blush |
| 1 | 10 |  |  |  |  |  | 1girl, elbow_gloves, looking_at_viewer, smile, solo, tiara, white_gloves, cleavage, holding_whip, navel, riding_crop, white_bra, frills, white_skirt, blunt_bangs, sitting, curtains, midriff, crossed_legs, knee_boots, tongue_out |
| 2 | 6 |  |  |  |  |  | 1girl, cleavage, cowboy_shot, elbow_gloves, holding_whip, looking_at_viewer, midriff, navel, riding_crop, solo, tiara, white_gloves, white_skirt, open_mouth, stomach, :d, white_bra, blunt_bangs, frilled_skirt, simple_background, very_long_hair |
| 3 | 9 |  |  |  |  |  | 1girl, cleavage, smile, solo, tiara, white_gloves, blush, elbow_gloves, looking_at_viewer, navel, white_bra, white_skirt, frills, midriff, very_long_hair, cowboy_shot, brown_eyes, open_mouth |
| 4 | 13 |  |  |  |  |  | 1girl, elbow_gloves, looking_at_viewer, peaked_cap, solo, holding_whip, midriff, navel, red_skirt, riding_crop, smile, cleavage, red_gloves, thighhighs, frilled_skirt, miniskirt, cowboy_shot, standing, stomach, blush, bra, simple_background, crop_top, sidelocks, tongue_out |
| 5 | 21 |  |  |  |  |  | 1girl, peaked_cap, looking_at_viewer, white_gloves, solo, riding_crop, military_uniform, smile, holding_whip, skirt, long_sleeves, simple_background, white_background, ascot |
| 6 | 9 |  |  |  |  |  | 1girl, looking_at_viewer, smile, solo, tiara, blush, collarbone, completely_nude, navel, nipples, simple_background, white_background, pussy, closed_mouth, large_breasts, uncensored |
| 7 | 19 |  |  |  |  |  | 1girl, blush, hetero, tiara, nipples, penis, 1boy, sex, sweat, pussy, vaginal, large_breasts, thighs, navel, open_mouth, smile, mosaic_censoring, looking_at_viewer, completely_nude, spread_legs, girl_on_top, lying, sidelocks, collarbone, cowgirl_position |
| 8 | 7 |  |  |  |  |  | 1girl, solo, bare_shoulders, looking_at_viewer, simple_background, smile, tiara, short_dress, white_background, white_dress, hairband, closed_mouth, sleeveless_dress |
| 9 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, solo, tiara, bare_shoulders, black_bikini, full_body, navel, smile, sandals, simple_background, white_background, cleavage, closed_mouth, standing |
| 10 | 10 |  |  |  |  |  | 1girl, looking_at_viewer, solo, tiara, white_bikini, cleavage, twintails, day, bare_shoulders, blue_sky, choker, navel, blush, outdoors, jewelry, open_mouth, sidelocks, :d, armpits, collarbone, very_long_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | elbow_gloves | holding_whip | looking_at_viewer | riding_crop | solo | tiara | white_gloves | smile | white_skirt | :q | blush | cleavage | navel | white_bra | frills | blunt_bangs | sitting | curtains | midriff | crossed_legs | knee_boots | tongue_out | cowboy_shot | open_mouth | stomach | :d | frilled_skirt | simple_background | very_long_hair | brown_eyes | peaked_cap | red_skirt | red_gloves | thighhighs | miniskirt | standing | bra | crop_top | sidelocks | military_uniform | skirt | long_sleeves | white_background | ascot | collarbone | completely_nude | nipples | pussy | closed_mouth | large_breasts | uncensored | hetero | penis | 1boy | sex | sweat | vaginal | thighs | mosaic_censoring | spread_legs | girl_on_top | lying | cowgirl_position | bare_shoulders | short_dress | white_dress | hairband | sleeveless_dress | black_bikini | full_body | sandals | white_bikini | twintails | day | blue_sky | choker | outdoors | jewelry | armpits |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:---------------|:---------------|:--------------------|:--------------|:-------|:--------|:---------------|:--------|:--------------|:-----|:--------|:-----------|:--------|:------------|:---------|:--------------|:----------|:-----------|:----------|:---------------|:-------------|:-------------|:--------------|:-------------|:----------|:-----|:----------------|:--------------------|:-----------------|:-------------|:-------------|:------------|:-------------|:-------------|:------------|:-----------|:------|:-----------|:------------|:-------------------|:--------|:---------------|:-------------------|:--------|:-------------|:------------------|:----------|:--------|:---------------|:----------------|:-------------|:---------|:--------|:-------|:------|:--------|:----------|:---------|:-------------------|:--------------|:--------------|:--------|:-------------------|:-----------------|:--------------|:--------------|:-----------|:-------------------|:---------------|:------------|:----------|:---------------|:------------|:------|:-----------|:---------|:-----------|:----------|:----------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | | X | | | X | X | X | | X | | | X | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 9 |  |  |  |  |  | X | X | | X | | X | X | X | X | X | | X | X | X | X | X | | | | X | | | | X | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 13 |  |  |  |  |  | X | X | X | X | X | X | | | X | | | X | X | X | | | | | | X | | | X | X | | X | | X | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 21 |  |  |  |  |  | X | | X | X | X | X | | X | X | | | | | | | | | | | | | | | | | | | | X | | | X | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 9 |  |  |  |  |  | X | | | X | | X | X | | X | | | X | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 19 |  |  |  |  |  | X | | | X | | | X | | X | | | X | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | X | | | | | | X | X | X | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 8 | 7 |  |  |  |  |  | X | | | X | | X | X | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | | | X | | X | X | | X | | | | X | X | | | | | | | | | | | | | | | X | | | | | | | | X | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | X | | | | | X | X | X | | | | | | | | |
| 10 | 10 |  |  |  |  |  | X | | | X | | X | X | | | | | X | X | X | | | | | | | | | | | X | | X | | | X | | | | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X |
|
Ketan8010/mini-jpmorgan-1 | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 320463
num_examples: 914
download_size: 106488
dataset_size: 320463
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mukesh3444/manual-window-detect | ---
license: apache-2.0
---
|
baaaaaaaam/koalpaca | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: inputs
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 2835767
num_examples: 2308
download_size: 460567
dataset_size: 2835767
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
winstonp/test-dataset | ---
license: other
---
|
another-symato/VMTEB-vietnamese_students_feedback_topic | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 1023776
num_examples: 11426
- name: validation
num_bytes: 136041
num_examples: 1583
- name: test
num_bytes: 282675
num_examples: 3166
download_size: 660583
dataset_size: 1442492
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
vmadhav/mydataset | ---
license: mit
---
|
open-llm-leaderboard/details_Ppoyaa__Alpha-Mistral-7B-Instruct | ---
pretty_name: Evaluation run of Ppoyaa/Alpha-Mistral-7B-Instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Ppoyaa/Alpha-Mistral-7B-Instruct](https://huggingface.co/Ppoyaa/Alpha-Mistral-7B-Instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Ppoyaa__Alpha-Mistral-7B-Instruct\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T11:01:11.453575](https://huggingface.co/datasets/open-llm-leaderboard/details_Ppoyaa__Alpha-Mistral-7B-Instruct/blob/main/results_2024-04-15T11-01-11.453575.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5292991804932395,\n\
\ \"acc_stderr\": 0.03414202331308852,\n \"acc_norm\": 0.5333047977390053,\n\
\ \"acc_norm_stderr\": 0.03486011046426882,\n \"mc1\": 0.3635250917992656,\n\
\ \"mc1_stderr\": 0.016838862883965827,\n \"mc2\": 0.5296347812834613,\n\
\ \"mc2_stderr\": 0.015236773829187759\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5878839590443686,\n \"acc_stderr\": 0.014383915302225403,\n\
\ \"acc_norm\": 0.636518771331058,\n \"acc_norm_stderr\": 0.014056207319068282\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6070503883688508,\n\
\ \"acc_stderr\": 0.004874076250521577,\n \"acc_norm\": 0.8149770961959769,\n\
\ \"acc_norm_stderr\": 0.0038752253693657315\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.03988903703336284,\n\
\ \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.03988903703336284\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n\
\ \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6113207547169811,\n \"acc_stderr\": 0.030000485448675986,\n\
\ \"acc_norm\": 0.6113207547169811,\n \"acc_norm_stderr\": 0.030000485448675986\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n\
\ \"acc_stderr\": 0.03778621079092056,\n \"acc_norm\": 0.5664739884393064,\n\
\ \"acc_norm_stderr\": 0.03778621079092056\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4723404255319149,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.4723404255319149,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n\
\ \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n\
\ \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.02519710107424648,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.02519710107424648\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n\
\ \"acc_stderr\": 0.041049472699033945,\n \"acc_norm\": 0.30158730158730157,\n\
\ \"acc_norm_stderr\": 0.041049472699033945\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.36451612903225805,\n\
\ \"acc_stderr\": 0.027379871229943252,\n \"acc_norm\": 0.36451612903225805,\n\
\ \"acc_norm_stderr\": 0.027379871229943252\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3448275862068966,\n \"acc_stderr\": 0.03344283744280458,\n\
\ \"acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.03344283744280458\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5818181818181818,\n \"acc_stderr\": 0.03851716319398394,\n\
\ \"acc_norm\": 0.5818181818181818,\n \"acc_norm_stderr\": 0.03851716319398394\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\"\
: 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.029519282616817234,\n\
\ \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.029519282616817234\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.48717948717948717,\n \"acc_stderr\": 0.025342671293807257,\n\
\ \"acc_norm\": 0.48717948717948717,\n \"acc_norm_stderr\": 0.025342671293807257\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712173,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712173\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5252100840336135,\n \"acc_stderr\": 0.03243718055137411,\n \
\ \"acc_norm\": 0.5252100840336135,\n \"acc_norm_stderr\": 0.03243718055137411\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7302752293577982,\n \"acc_stderr\": 0.01902848671111544,\n \"\
acc_norm\": 0.7302752293577982,\n \"acc_norm_stderr\": 0.01902848671111544\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.33796296296296297,\n \"acc_stderr\": 0.03225941352631296,\n \"\
acc_norm\": 0.33796296296296297,\n \"acc_norm_stderr\": 0.03225941352631296\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6029411764705882,\n \"acc_stderr\": 0.03434131164719129,\n \"\
acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.03434131164719129\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7046413502109705,\n \"acc_stderr\": 0.02969633871342288,\n \
\ \"acc_norm\": 0.7046413502109705,\n \"acc_norm_stderr\": 0.02969633871342288\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n\
\ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302873,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302873\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.04668408033024931,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.04668408033024931\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6012269938650306,\n \"acc_stderr\": 0.03847021420456023,\n\
\ \"acc_norm\": 0.6012269938650306,\n \"acc_norm_stderr\": 0.03847021420456023\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6407766990291263,\n \"acc_stderr\": 0.04750458399041695,\n\
\ \"acc_norm\": 0.6407766990291263,\n \"acc_norm_stderr\": 0.04750458399041695\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7905982905982906,\n\
\ \"acc_stderr\": 0.026655699653922754,\n \"acc_norm\": 0.7905982905982906,\n\
\ \"acc_norm_stderr\": 0.026655699653922754\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7381864623243933,\n\
\ \"acc_stderr\": 0.01572083867844526,\n \"acc_norm\": 0.7381864623243933,\n\
\ \"acc_norm_stderr\": 0.01572083867844526\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5867052023121387,\n \"acc_stderr\": 0.026511261369409244,\n\
\ \"acc_norm\": 0.5867052023121387,\n \"acc_norm_stderr\": 0.026511261369409244\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36089385474860336,\n\
\ \"acc_stderr\": 0.01606229067111047,\n \"acc_norm\": 0.36089385474860336,\n\
\ \"acc_norm_stderr\": 0.01606229067111047\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6013071895424836,\n \"acc_stderr\": 0.028036092273891772,\n\
\ \"acc_norm\": 0.6013071895424836,\n \"acc_norm_stderr\": 0.028036092273891772\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5852090032154341,\n\
\ \"acc_stderr\": 0.027982680459759567,\n \"acc_norm\": 0.5852090032154341,\n\
\ \"acc_norm_stderr\": 0.027982680459759567\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6080246913580247,\n \"acc_stderr\": 0.027163686038271146,\n\
\ \"acc_norm\": 0.6080246913580247,\n \"acc_norm_stderr\": 0.027163686038271146\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.425531914893617,\n \"acc_stderr\": 0.029494827600144366,\n \
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.029494827600144366\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.378748370273794,\n\
\ \"acc_stderr\": 0.012389052105003741,\n \"acc_norm\": 0.378748370273794,\n\
\ \"acc_norm_stderr\": 0.012389052105003741\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.03018753206032939,\n\
\ \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.03018753206032939\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.545751633986928,\n \"acc_stderr\": 0.020142974553795205,\n \
\ \"acc_norm\": 0.545751633986928,\n \"acc_norm_stderr\": 0.020142974553795205\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.030299506562154185,\n\
\ \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.030299506562154185\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3880597014925373,\n\
\ \"acc_stderr\": 0.03445789964362749,\n \"acc_norm\": 0.3880597014925373,\n\
\ \"acc_norm_stderr\": 0.03445789964362749\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3635250917992656,\n\
\ \"mc1_stderr\": 0.016838862883965827,\n \"mc2\": 0.5296347812834613,\n\
\ \"mc2_stderr\": 0.015236773829187759\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7687450670876085,\n \"acc_stderr\": 0.011850040124850508\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3100833965125095,\n \
\ \"acc_stderr\": 0.012740305717376268\n }\n}\n```"
repo_url: https://huggingface.co/Ppoyaa/Alpha-Mistral-7B-Instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|arc:challenge|25_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|gsm8k|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hellaswag|10_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T11-01-11.453575.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T11-01-11.453575.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- '**/details_harness|winogrande|5_2024-04-15T11-01-11.453575.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T11-01-11.453575.parquet'
- config_name: results
data_files:
- split: 2024_04_15T11_01_11.453575
path:
- results_2024-04-15T11-01-11.453575.parquet
- split: latest
path:
- results_2024-04-15T11-01-11.453575.parquet
---
# Dataset Card for Evaluation run of Ppoyaa/Alpha-Mistral-7B-Instruct
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Ppoyaa/Alpha-Mistral-7B-Instruct](https://huggingface.co/Ppoyaa/Alpha-Mistral-7B-Instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Ppoyaa__Alpha-Mistral-7B-Instruct",
"harness_winogrande_5",
split="train")
```
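The per-task config names follow a regular pattern (`harness_<task>_<n_shots>`, with `|` and `-` in the original task name replaced by `_`), so the config string can be built programmatically rather than typed by hand. A minimal sketch, assuming this naming convention holds for all 63 configs; the `harness_config` helper is illustrative, not part of the `datasets` API (the actual `load_dataset` call is shown commented since it requires network access):

```python
# Build the config name for a given task and few-shot setting.
# Illustrative helper, not part of any library API.
def harness_config(task: str, n_shots: int) -> str:
    return f"harness_{task}_{n_shots}"

config = harness_config("hendrycksTest_anatomy", 5)
print(config)  # harness_hendrycksTest_anatomy_5

# To actually fetch the data (requires `datasets` and network access):
# from datasets import load_dataset
# data = load_dataset(
#     "open-llm-leaderboard/details_Ppoyaa__Alpha-Mistral-7B-Instruct",
#     config,
#     split="latest",  # or a timestamped split like "2024_04_15T11_01_11.453575"
# )
```

The `split` argument accepts either `"latest"` (always the most recent run) or a specific timestamped split name, as listed in the configs above.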
## Latest results
These are the [latest results from run 2024-04-15T11:01:11.453575](https://huggingface.co/datasets/open-llm-leaderboard/details_Ppoyaa__Alpha-Mistral-7B-Instruct/blob/main/results_2024-04-15T11-01-11.453575.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5292991804932395,
"acc_stderr": 0.03414202331308852,
"acc_norm": 0.5333047977390053,
"acc_norm_stderr": 0.03486011046426882,
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965827,
"mc2": 0.5296347812834613,
"mc2_stderr": 0.015236773829187759
},
"harness|arc:challenge|25": {
"acc": 0.5878839590443686,
"acc_stderr": 0.014383915302225403,
"acc_norm": 0.636518771331058,
"acc_norm_stderr": 0.014056207319068282
},
"harness|hellaswag|10": {
"acc": 0.6070503883688508,
"acc_stderr": 0.004874076250521577,
"acc_norm": 0.8149770961959769,
"acc_norm_stderr": 0.0038752253693657315
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.03988903703336284,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.03988903703336284
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6113207547169811,
"acc_stderr": 0.030000485448675986,
"acc_norm": 0.6113207547169811,
"acc_norm_stderr": 0.030000485448675986
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.03778621079092056,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.03778621079092056
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4723404255319149,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.4723404255319149,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.02519710107424648,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.02519710107424648
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.041049472699033945,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.041049472699033945
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.36451612903225805,
"acc_stderr": 0.027379871229943252,
"acc_norm": 0.36451612903225805,
"acc_norm_stderr": 0.027379871229943252
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.03344283744280458,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.03344283744280458
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.03851716319398394,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.03851716319398394
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7875647668393783,
"acc_stderr": 0.029519282616817234,
"acc_norm": 0.7875647668393783,
"acc_norm_stderr": 0.029519282616817234
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.48717948717948717,
"acc_stderr": 0.025342671293807257,
"acc_norm": 0.48717948717948717,
"acc_norm_stderr": 0.025342671293807257
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712173,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712173
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5252100840336135,
"acc_stderr": 0.03243718055137411,
"acc_norm": 0.5252100840336135,
"acc_norm_stderr": 0.03243718055137411
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7302752293577982,
"acc_stderr": 0.01902848671111544,
"acc_norm": 0.7302752293577982,
"acc_norm_stderr": 0.01902848671111544
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.33796296296296297,
"acc_stderr": 0.03225941352631296,
"acc_norm": 0.33796296296296297,
"acc_norm_stderr": 0.03225941352631296
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.03434131164719129,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.03434131164719129
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7046413502109705,
"acc_stderr": 0.02969633871342288,
"acc_norm": 0.7046413502109705,
"acc_norm_stderr": 0.02969633871342288
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302873,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302873
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04668408033024931,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04668408033024931
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6012269938650306,
"acc_stderr": 0.03847021420456023,
"acc_norm": 0.6012269938650306,
"acc_norm_stderr": 0.03847021420456023
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.6407766990291263,
"acc_stderr": 0.04750458399041695,
"acc_norm": 0.6407766990291263,
"acc_norm_stderr": 0.04750458399041695
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7905982905982906,
"acc_stderr": 0.026655699653922754,
"acc_norm": 0.7905982905982906,
"acc_norm_stderr": 0.026655699653922754
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7381864623243933,
"acc_stderr": 0.01572083867844526,
"acc_norm": 0.7381864623243933,
"acc_norm_stderr": 0.01572083867844526
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5867052023121387,
"acc_stderr": 0.026511261369409244,
"acc_norm": 0.5867052023121387,
"acc_norm_stderr": 0.026511261369409244
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36089385474860336,
"acc_stderr": 0.01606229067111047,
"acc_norm": 0.36089385474860336,
"acc_norm_stderr": 0.01606229067111047
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6013071895424836,
"acc_stderr": 0.028036092273891772,
"acc_norm": 0.6013071895424836,
"acc_norm_stderr": 0.028036092273891772
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5852090032154341,
"acc_stderr": 0.027982680459759567,
"acc_norm": 0.5852090032154341,
"acc_norm_stderr": 0.027982680459759567
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6080246913580247,
"acc_stderr": 0.027163686038271146,
"acc_norm": 0.6080246913580247,
"acc_norm_stderr": 0.027163686038271146
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.029494827600144366,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.029494827600144366
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.378748370273794,
"acc_stderr": 0.012389052105003741,
"acc_norm": 0.378748370273794,
"acc_norm_stderr": 0.012389052105003741
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.03018753206032939,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.03018753206032939
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.545751633986928,
"acc_stderr": 0.020142974553795205,
"acc_norm": 0.545751633986928,
"acc_norm_stderr": 0.020142974553795205
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6612244897959184,
"acc_stderr": 0.030299506562154185,
"acc_norm": 0.6612244897959184,
"acc_norm_stderr": 0.030299506562154185
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.3880597014925373,
"acc_stderr": 0.03445789964362749,
"acc_norm": 0.3880597014925373,
"acc_norm_stderr": 0.03445789964362749
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965827,
"mc2": 0.5296347812834613,
"mc2_stderr": 0.015236773829187759
},
"harness|winogrande|5": {
"acc": 0.7687450670876085,
"acc_stderr": 0.011850040124850508
},
"harness|gsm8k|5": {
"acc": 0.3100833965125095,
"acc_stderr": 0.012740305717376268
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
zcahjl3/test_2024_1_4 | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: CoT_example
dtype: string
- name: example_embeddings
sequence: float32
splits:
- name: train
num_bytes: 415916
num_examples: 100
download_size: 528195
dataset_size: 415916
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gowitheflowlab/parallel-small | ---
dataset_info:
features:
- name: English
dtype: string
- name: Other Language
dtype: string
splits:
- name: train
num_bytes: 961089941
num_examples: 6330819
download_size: 585896345
dataset_size: 961089941
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Trelis/llm-lingo | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
- name: start_time
dtype: string
- name: end_time
dtype: string
splits:
- name: train
num_bytes: 1273208.0
num_examples: 6
- name: validation
num_bytes: 1229380.0
num_examples: 5
download_size: 2508852
dataset_size: 2502588.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
bbbh/a | ---
license: other
---
|
apapiu/simple_text_files | ---
license: apache-2.0
---
|
sethapun/arithmetic_2all_1to250 | ---
dataset_info:
features:
- name: expression
dtype: string
- name: answer
dtype: float64
- name: label
dtype:
class_label:
names:
'0': 'false'
'1': 'true'
splits:
- name: train
num_bytes: 60348
num_examples: 2000
- name: validation
num_bytes: 12054
num_examples: 400
download_size: 30282
dataset_size: 72402
---
# Dataset Card for "arithmetic_2all_1to250"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hanifabdlh/Setfit-Multi-Duplicate-Sample-Dataset-Experiment | ---
dataset_info:
features:
- name: sample_text
dtype: string
- name: label
dtype:
class_label:
names:
'0': BXGZhBRF_affirm
'1': BXGZhBRF_bot_challenge
'2': BXGZhBRF_deny
'3': BXGZhBRF_goodbye
'4': BXGZhBRF_greet
'5': BXGZhBRF_mood_great
'6': BXGZhBRF_mood_unhappy
'7': BkLgmsGS_affirm
'8': BkLgmsGS_bot_challenge
'9': BkLgmsGS_deny
'10': BkLgmsGS_goodbye
'11': BkLgmsGS_greet
'12': BkLgmsGS_mood_great
'13': BkLgmsGS_mood_unhappy
'14': BsBngPYk_affirm
'15': BsBngPYk_bot_challenge
'16': BsBngPYk_deny
'17': BsBngPYk_goodbye
'18': BsBngPYk_greet
'19': BsBngPYk_mood_great
'20': BsBngPYk_mood_unhappy
'21': BtlTrgSr_affirm
'22': BtlTrgSr_bot_challenge
'23': BtlTrgSr_deny
'24': BtlTrgSr_goodbye
'25': BtlTrgSr_greet
'26': BtlTrgSr_mood_great
'27': BtlTrgSr_mood_unhappy
'28': BzjnBHNx_affirm
'29': BzjnBHNx_bot_challenge
'30': BzjnBHNx_deny
'31': BzjnBHNx_goodbye
'32': BzjnBHNx_greet
'33': BzjnBHNx_mood_great
'34': BzjnBHNx_mood_unhappy
'35': CDJmtQHT_affirm
'36': CDJmtQHT_bot_challenge
'37': CDJmtQHT_deny
'38': CDJmtQHT_goodbye
'39': CDJmtQHT_greet
'40': CDJmtQHT_mood_great
'41': CDJmtQHT_mood_unhappy
'42': CJfxWyCj_affirm
'43': CJfxWyCj_bot_challenge
'44': CJfxWyCj_deny
'45': CJfxWyCj_goodbye
'46': CJfxWyCj_greet
'47': CJfxWyCj_mood_great
'48': CJfxWyCj_mood_unhappy
'49': CLwyvskq_affirm
'50': CLwyvskq_bot_challenge
'51': CLwyvskq_deny
'52': CLwyvskq_goodbye
'53': CLwyvskq_greet
'54': CLwyvskq_mood_great
'55': CLwyvskq_mood_unhappy
'56': CVyYJvLy_affirm
'57': CVyYJvLy_bot_challenge
'58': CVyYJvLy_deny
'59': CVyYJvLy_goodbye
'60': CVyYJvLy_greet
'61': CVyYJvLy_mood_great
'62': CVyYJvLy_mood_unhappy
'63': DMSQCJSB_affirm
'64': DMSQCJSB_bot_challenge
'65': DMSQCJSB_deny
'66': DMSQCJSB_goodbye
'67': DMSQCJSB_greet
'68': DMSQCJSB_mood_great
'69': DMSQCJSB_mood_unhappy
'70': DPVjKPsX_affirm
'71': DPVjKPsX_bot_challenge
'72': DPVjKPsX_deny
'73': DPVjKPsX_goodbye
'74': DPVjKPsX_greet
'75': DPVjKPsX_mood_great
'76': DPVjKPsX_mood_unhappy
'77': DQPwnvMx_affirm
'78': DQPwnvMx_bot_challenge
'79': DQPwnvMx_deny
'80': DQPwnvMx_goodbye
'81': DQPwnvMx_greet
'82': DQPwnvMx_mood_great
'83': DQPwnvMx_mood_unhappy
'84': DhGbvRkL_affirm
'85': DhGbvRkL_bot_challenge
'86': DhGbvRkL_deny
'87': DhGbvRkL_goodbye
'88': DhGbvRkL_greet
'89': DhGbvRkL_mood_great
'90': DhGbvRkL_mood_unhappy
'91': DpybRlwd_affirm
'92': DpybRlwd_bot_challenge
'93': DpybRlwd_deny
'94': DpybRlwd_goodbye
'95': DpybRlwd_greet
'96': DpybRlwd_mood_great
'97': DpybRlwd_mood_unhappy
'98': DrKKtDjD_affirm
'99': DrKKtDjD_bot_challenge
'100': DrKKtDjD_deny
'101': DrKKtDjD_goodbye
'102': DrKKtDjD_greet
'103': DrKKtDjD_mood_great
'104': DrKKtDjD_mood_unhappy
'105': DvVFYDjs_affirm
'106': DvVFYDjs_bot_challenge
'107': DvVFYDjs_deny
'108': DvVFYDjs_goodbye
'109': DvVFYDjs_greet
'110': DvVFYDjs_mood_great
'111': DvVFYDjs_mood_unhappy
'112': FKKyVJyT_affirm
'113': FKKyVJyT_bot_challenge
'114': FKKyVJyT_deny
'115': FKKyVJyT_goodbye
'116': FKKyVJyT_greet
'117': FKKyVJyT_mood_great
'118': FKKyVJyT_mood_unhappy
'119': FRsPSCgG_affirm
'120': FRsPSCgG_bot_challenge
'121': FRsPSCgG_deny
'122': FRsPSCgG_goodbye
'123': FRsPSCgG_greet
'124': FRsPSCgG_mood_great
'125': FRsPSCgG_mood_unhappy
'126': GJjNKghz_affirm
'127': GJjNKghz_bot_challenge
'128': GJjNKghz_deny
'129': GJjNKghz_goodbye
'130': GJjNKghz_greet
'131': GJjNKghz_mood_great
'132': GJjNKghz_mood_unhappy
'133': GSdLhzvz_affirm
'134': GSdLhzvz_bot_challenge
'135': GSdLhzvz_deny
'136': GSdLhzvz_goodbye
'137': GSdLhzvz_greet
'138': GSdLhzvz_mood_great
'139': GSdLhzvz_mood_unhappy
'140': HKWMxdsx_affirm
'141': HKWMxdsx_bot_challenge
'142': HKWMxdsx_deny
'143': HKWMxdsx_goodbye
'144': HKWMxdsx_greet
'145': HKWMxdsx_mood_great
'146': HKWMxdsx_mood_unhappy
'147': HZjNXHwd_affirm
'148': HZjNXHwd_bot_challenge
'149': HZjNXHwd_deny
'150': HZjNXHwd_goodbye
'151': HZjNXHwd_greet
'152': HZjNXHwd_mood_great
'153': HZjNXHwd_mood_unhappy
'154': HwrPkCJk_affirm
'155': HwrPkCJk_bot_challenge
'156': HwrPkCJk_deny
'157': HwrPkCJk_goodbye
'158': HwrPkCJk_greet
'159': HwrPkCJk_mood_great
'160': HwrPkCJk_mood_unhappy
'161': JKsHBLGH_affirm
'162': JKsHBLGH_bot_challenge
'163': JKsHBLGH_deny
'164': JKsHBLGH_goodbye
'165': JKsHBLGH_greet
'166': JKsHBLGH_mood_great
'167': JKsHBLGH_mood_unhappy
'168': JkRsZmwm_affirm
'169': JkRsZmwm_bot_challenge
'170': JkRsZmwm_deny
'171': JkRsZmwm_goodbye
'172': JkRsZmwm_greet
'173': JkRsZmwm_mood_great
'174': JkRsZmwm_mood_unhappy
'175': KhWdJzWv_affirm
'176': KhWdJzWv_bot_challenge
'177': KhWdJzWv_deny
'178': KhWdJzWv_goodbye
'179': KhWdJzWv_greet
'180': KhWdJzWv_mood_great
'181': KhWdJzWv_mood_unhappy
'182': Knycdzfx_affirm
'183': Knycdzfx_bot_challenge
'184': Knycdzfx_deny
'185': Knycdzfx_goodbye
'186': Knycdzfx_greet
'187': Knycdzfx_mood_great
'188': Knycdzfx_mood_unhappy
'189': LhrZYWcf_affirm
'190': LhrZYWcf_bot_challenge
'191': LhrZYWcf_deny
'192': LhrZYWcf_goodbye
'193': LhrZYWcf_greet
'194': LhrZYWcf_mood_great
'195': LhrZYWcf_mood_unhappy
'196': MCBSRFLz_affirm
'197': MCBSRFLz_bot_challenge
'198': MCBSRFLz_deny
'199': MCBSRFLz_goodbye
'200': MCBSRFLz_greet
'201': MCBSRFLz_mood_great
'202': MCBSRFLz_mood_unhappy
'203': MFFZmNdl_affirm
'204': MFFZmNdl_bot_challenge
'205': MFFZmNdl_deny
'206': MFFZmNdl_goodbye
'207': MFFZmNdl_greet
'208': MFFZmNdl_mood_great
'209': MFFZmNdl_mood_unhappy
'210': McXSzXsZ_affirm
'211': McXSzXsZ_bot_challenge
'212': McXSzXsZ_deny
'213': McXSzXsZ_goodbye
'214': McXSzXsZ_greet
'215': McXSzXsZ_mood_great
'216': McXSzXsZ_mood_unhappy
'217': NVmsLYvL_affirm
'218': NVmsLYvL_bot_challenge
'219': NVmsLYvL_deny
'220': NVmsLYvL_goodbye
'221': NVmsLYvL_greet
'222': NVmsLYvL_mood_great
'223': NVmsLYvL_mood_unhappy
'224': PVBjKjQL_affirm
'225': PVBjKjQL_bot_challenge
'226': PVBjKjQL_deny
'227': PVBjKjQL_goodbye
'228': PVBjKjQL_greet
'229': PVBjKjQL_mood_great
'230': PVBjKjQL_mood_unhappy
'231': RZCfGwqk_affirm
'232': RZCfGwqk_bot_challenge
'233': RZCfGwqk_deny
'234': RZCfGwqk_goodbye
'235': RZCfGwqk_greet
'236': RZCfGwqk_mood_great
'237': RZCfGwqk_mood_unhappy
'238': SDkwTsJS_affirm
'239': SDkwTsJS_bot_challenge
'240': SDkwTsJS_deny
'241': SDkwTsJS_goodbye
'242': SDkwTsJS_greet
'243': SDkwTsJS_mood_great
'244': SDkwTsJS_mood_unhappy
'245': SHhskLkZ_affirm
'246': SHhskLkZ_bot_challenge
'247': SHhskLkZ_deny
'248': SHhskLkZ_goodbye
'249': SHhskLkZ_greet
'250': SHhskLkZ_mood_great
'251': SHhskLkZ_mood_unhappy
'252': SfVSSBkH_affirm
'253': SfVSSBkH_bot_challenge
'254': SfVSSBkH_deny
'255': SfVSSBkH_goodbye
'256': SfVSSBkH_greet
'257': SfVSSBkH_mood_great
'258': SfVSSBkH_mood_unhappy
'259': SzFWQxLP_affirm
'260': SzFWQxLP_bot_challenge
'261': SzFWQxLP_deny
'262': SzFWQxLP_goodbye
'263': SzFWQxLP_greet
'264': SzFWQxLP_mood_great
'265': SzFWQxLP_mood_unhappy
'266': TGYHxkGc_affirm
'267': TGYHxkGc_bot_challenge
'268': TGYHxkGc_deny
'269': TGYHxkGc_goodbye
'270': TGYHxkGc_greet
'271': TGYHxkGc_mood_great
'272': TGYHxkGc_mood_unhappy
'273': TGgmYchQ_affirm
'274': TGgmYchQ_bot_challenge
'275': TGgmYchQ_deny
'276': TGgmYchQ_goodbye
'277': TGgmYchQ_greet
'278': TGgmYchQ_mood_great
'279': TGgmYchQ_mood_unhappy
'280': TPrmMsjC_affirm
'281': TPrmMsjC_bot_challenge
'282': TPrmMsjC_deny
'283': TPrmMsjC_goodbye
'284': TPrmMsjC_greet
'285': TPrmMsjC_mood_great
'286': TPrmMsjC_mood_unhappy
'287': TxMMKBpf_affirm
'288': TxMMKBpf_bot_challenge
'289': TxMMKBpf_deny
'290': TxMMKBpf_goodbye
'291': TxMMKBpf_greet
'292': TxMMKBpf_mood_great
'293': TxMMKBpf_mood_unhappy
'294': TxxcGwbT_affirm
'295': TxxcGwbT_bot_challenge
'296': TxxcGwbT_deny
'297': TxxcGwbT_goodbye
'298': TxxcGwbT_greet
'299': TxxcGwbT_mood_great
'300': TxxcGwbT_mood_unhappy
'301': VJYNhMCH_affirm
'302': VJYNhMCH_bot_challenge
'303': VJYNhMCH_deny
'304': VJYNhMCH_goodbye
'305': VJYNhMCH_greet
'306': VJYNhMCH_mood_great
'307': VJYNhMCH_mood_unhappy
'308': VQWxjqxs_affirm
'309': VQWxjqxs_bot_challenge
'310': VQWxjqxs_deny
'311': VQWxjqxs_goodbye
'312': VQWxjqxs_greet
'313': VQWxjqxs_mood_great
'314': VQWxjqxs_mood_unhappy
'315': VrsBmWbw_affirm
'316': VrsBmWbw_bot_challenge
'317': VrsBmWbw_deny
'318': VrsBmWbw_goodbye
'319': VrsBmWbw_greet
'320': VrsBmWbw_mood_great
'321': VrsBmWbw_mood_unhappy
'322': WJjkZZSr_affirm
'323': WJjkZZSr_bot_challenge
'324': WJjkZZSr_deny
'325': WJjkZZSr_goodbye
'326': WJjkZZSr_greet
'327': WJjkZZSr_mood_great
'328': WJjkZZSr_mood_unhappy
'329': Wfbryrlv_affirm
'330': Wfbryrlv_bot_challenge
'331': Wfbryrlv_deny
'332': Wfbryrlv_goodbye
'333': Wfbryrlv_greet
'334': Wfbryrlv_mood_great
'335': Wfbryrlv_mood_unhappy
'336': WmglwXYV_affirm
'337': WmglwXYV_bot_challenge
'338': WmglwXYV_deny
'339': WmglwXYV_goodbye
'340': WmglwXYV_greet
'341': WmglwXYV_mood_great
'342': WmglwXYV_mood_unhappy
'343': XXZScWwc_affirm
'344': XXZScWwc_bot_challenge
'345': XXZScWwc_deny
'346': XXZScWwc_goodbye
'347': XXZScWwc_greet
'348': XXZScWwc_mood_great
'349': XXZScWwc_mood_unhappy
'350': XfXjrYrv_affirm
'351': XfXjrYrv_bot_challenge
'352': XfXjrYrv_deny
'353': XfXjrYrv_goodbye
'354': XfXjrYrv_greet
'355': XfXjrYrv_mood_great
'356': XfXjrYrv_mood_unhappy
'357': YfmmTyht_affirm
'358': YfmmTyht_bot_challenge
'359': YfmmTyht_deny
'360': YfmmTyht_goodbye
'361': YfmmTyht_greet
'362': YfmmTyht_mood_great
'363': YfmmTyht_mood_unhappy
'364': YqXvlYjY_affirm
'365': YqXvlYjY_bot_challenge
'366': YqXvlYjY_deny
'367': YqXvlYjY_goodbye
'368': YqXvlYjY_greet
'369': YqXvlYjY_mood_great
'370': YqXvlYjY_mood_unhappy
'371': YxshJjYj_affirm
'372': YxshJjYj_bot_challenge
'373': YxshJjYj_deny
'374': YxshJjYj_goodbye
'375': YxshJjYj_greet
'376': YxshJjYj_mood_great
'377': YxshJjYj_mood_unhappy
'378': ZdyKdVDl_affirm
'379': ZdyKdVDl_bot_challenge
'380': ZdyKdVDl_deny
'381': ZdyKdVDl_goodbye
'382': ZdyKdVDl_greet
'383': ZdyKdVDl_mood_great
'384': ZdyKdVDl_mood_unhappy
'385': affirm
'386': bCtVzcpv_affirm
'387': bCtVzcpv_bot_challenge
'388': bCtVzcpv_deny
'389': bCtVzcpv_goodbye
'390': bCtVzcpv_greet
'391': bCtVzcpv_mood_great
'392': bCtVzcpv_mood_unhappy
'393': bmHZmgMv_affirm
'394': bmHZmgMv_bot_challenge
'395': bmHZmgMv_deny
'396': bmHZmgMv_goodbye
'397': bmHZmgMv_greet
'398': bmHZmgMv_mood_great
'399': bmHZmgMv_mood_unhappy
'400': bot_challenge
'401': cYWtnySb_affirm
'402': cYWtnySb_bot_challenge
'403': cYWtnySb_deny
'404': cYWtnySb_goodbye
'405': cYWtnySb_greet
'406': cYWtnySb_mood_great
'407': cYWtnySb_mood_unhappy
'408': chsTVcwT_affirm
'409': chsTVcwT_bot_challenge
'410': chsTVcwT_deny
'411': chsTVcwT_goodbye
'412': chsTVcwT_greet
'413': chsTVcwT_mood_great
'414': chsTVcwT_mood_unhappy
'415': dJcLgzTJ_affirm
'416': dJcLgzTJ_bot_challenge
'417': dJcLgzTJ_deny
'418': dJcLgzTJ_goodbye
'419': dJcLgzTJ_greet
'420': dJcLgzTJ_mood_great
'421': dJcLgzTJ_mood_unhappy
'422': dQyLZcfM_affirm
'423': dQyLZcfM_bot_challenge
'424': dQyLZcfM_deny
'425': dQyLZcfM_goodbye
'426': dQyLZcfM_greet
'427': dQyLZcfM_mood_great
'428': dQyLZcfM_mood_unhappy
'429': ddyNkjtR_affirm
'430': ddyNkjtR_bot_challenge
'431': ddyNkjtR_deny
'432': ddyNkjtR_goodbye
'433': ddyNkjtR_greet
'434': ddyNkjtR_mood_great
'435': ddyNkjtR_mood_unhappy
'436': deny
'437': dpMQZKwp_affirm
'438': dpMQZKwp_bot_challenge
'439': dpMQZKwp_deny
'440': dpMQZKwp_goodbye
'441': dpMQZKwp_greet
'442': dpMQZKwp_mood_great
'443': dpMQZKwp_mood_unhappy
'444': fSxlYNjb_affirm
'445': fSxlYNjb_bot_challenge
'446': fSxlYNjb_deny
'447': fSxlYNjb_goodbye
'448': fSxlYNjb_greet
'449': fSxlYNjb_mood_great
'450': fSxlYNjb_mood_unhappy
'451': fWNxsWCs_affirm
'452': fWNxsWCs_bot_challenge
'453': fWNxsWCs_deny
'454': fWNxsWCs_goodbye
'455': fWNxsWCs_greet
'456': fWNxsWCs_mood_great
'457': fWNxsWCs_mood_unhappy
'458': fhwKJNWK_affirm
'459': fhwKJNWK_bot_challenge
'460': fhwKJNWK_deny
'461': fhwKJNWK_goodbye
'462': fhwKJNWK_greet
'463': fhwKJNWK_mood_great
'464': fhwKJNWK_mood_unhappy
'465': flHQkgMK_affirm
'466': flHQkgMK_bot_challenge
'467': flHQkgMK_deny
'468': flHQkgMK_goodbye
'469': flHQkgMK_greet
'470': flHQkgMK_mood_great
'471': flHQkgMK_mood_unhappy
'472': gYLyZHkl_affirm
'473': gYLyZHkl_bot_challenge
'474': gYLyZHkl_deny
'475': gYLyZHkl_goodbye
'476': gYLyZHkl_greet
'477': gYLyZHkl_mood_great
'478': gYLyZHkl_mood_unhappy
'479': gfnyWTxK_affirm
'480': gfnyWTxK_bot_challenge
'481': gfnyWTxK_deny
'482': gfnyWTxK_goodbye
'483': gfnyWTxK_greet
'484': gfnyWTxK_mood_great
'485': gfnyWTxK_mood_unhappy
'486': ghhhJjDl_affirm
'487': ghhhJjDl_bot_challenge
'488': ghhhJjDl_deny
'489': ghhhJjDl_goodbye
'490': ghhhJjDl_greet
'491': ghhhJjDl_mood_great
'492': ghhhJjDl_mood_unhappy
'493': goodbye
'494': greet
'495': hntpSgnb_affirm
'496': hntpSgnb_bot_challenge
'497': hntpSgnb_deny
'498': hntpSgnb_goodbye
'499': hntpSgnb_greet
'500': hntpSgnb_mood_great
'501': hntpSgnb_mood_unhappy
'502': hxzHpqlR_affirm
'503': hxzHpqlR_bot_challenge
'504': hxzHpqlR_deny
'505': hxzHpqlR_goodbye
'506': hxzHpqlR_greet
'507': hxzHpqlR_mood_great
'508': hxzHpqlR_mood_unhappy
'509': jzJzvDmM_affirm
'510': jzJzvDmM_bot_challenge
'511': jzJzvDmM_deny
'512': jzJzvDmM_goodbye
'513': jzJzvDmM_greet
'514': jzJzvDmM_mood_great
'515': jzJzvDmM_mood_unhappy
'516': kGPbRLHf_affirm
'517': kGPbRLHf_bot_challenge
'518': kGPbRLHf_deny
'519': kGPbRLHf_goodbye
'520': kGPbRLHf_greet
'521': kGPbRLHf_mood_great
'522': kGPbRLHf_mood_unhappy
'523': kLyGwPRN_affirm
'524': kLyGwPRN_bot_challenge
'525': kLyGwPRN_deny
'526': kLyGwPRN_goodbye
'527': kLyGwPRN_greet
'528': kLyGwPRN_mood_great
'529': kLyGwPRN_mood_unhappy
'530': kSHWWXsQ_affirm
'531': kSHWWXsQ_bot_challenge
'532': kSHWWXsQ_deny
'533': kSHWWXsQ_goodbye
'534': kSHWWXsQ_greet
'535': kSHWWXsQ_mood_great
'536': kSHWWXsQ_mood_unhappy
'537': lFLDRHVm_affirm
'538': lFLDRHVm_bot_challenge
'539': lFLDRHVm_deny
'540': lFLDRHVm_goodbye
'541': lFLDRHVm_greet
'542': lFLDRHVm_mood_great
'543': lFLDRHVm_mood_unhappy
'544': lgwHBPwb_affirm
'545': lgwHBPwb_bot_challenge
'546': lgwHBPwb_deny
'547': lgwHBPwb_goodbye
'548': lgwHBPwb_greet
'549': lgwHBPwb_mood_great
'550': lgwHBPwb_mood_unhappy
'551': mNxPwHMM_affirm
'552': mNxPwHMM_bot_challenge
'553': mNxPwHMM_deny
'554': mNxPwHMM_goodbye
'555': mNxPwHMM_greet
'556': mNxPwHMM_mood_great
'557': mNxPwHMM_mood_unhappy
'558': mlRvpkcl_affirm
'559': mlRvpkcl_bot_challenge
'560': mlRvpkcl_deny
'561': mlRvpkcl_goodbye
'562': mlRvpkcl_greet
'563': mlRvpkcl_mood_great
'564': mlRvpkcl_mood_unhappy
'565': mood_great
'566': mood_unhappy
'567': pnphXFXt_affirm
'568': pnphXFXt_bot_challenge
'569': pnphXFXt_deny
'570': pnphXFXt_goodbye
'571': pnphXFXt_greet
'572': pnphXFXt_mood_great
'573': pnphXFXt_mood_unhappy
'574': pwyBCbpm_affirm
'575': pwyBCbpm_bot_challenge
'576': pwyBCbpm_deny
'577': pwyBCbpm_goodbye
'578': pwyBCbpm_greet
'579': pwyBCbpm_mood_great
'580': pwyBCbpm_mood_unhappy
'581': qFMjtFTC_affirm
'582': qFMjtFTC_bot_challenge
'583': qFMjtFTC_deny
'584': qFMjtFTC_goodbye
'585': qFMjtFTC_greet
'586': qFMjtFTC_mood_great
'587': qFMjtFTC_mood_unhappy
'588': qQDXNYKf_affirm
'589': qQDXNYKf_bot_challenge
'590': qQDXNYKf_deny
'591': qQDXNYKf_goodbye
'592': qQDXNYKf_greet
'593': qQDXNYKf_mood_great
'594': qQDXNYKf_mood_unhappy
'595': qtTFsHdz_affirm
'596': qtTFsHdz_bot_challenge
'597': qtTFsHdz_deny
'598': qtTFsHdz_goodbye
'599': qtTFsHdz_greet
'600': qtTFsHdz_mood_great
'601': qtTFsHdz_mood_unhappy
'602': rFHLBnDy_affirm
'603': rFHLBnDy_bot_challenge
'604': rFHLBnDy_deny
'605': rFHLBnDy_goodbye
'606': rFHLBnDy_greet
'607': rFHLBnDy_mood_great
'608': rFHLBnDy_mood_unhappy
'609': rwSlxKNk_affirm
'610': rwSlxKNk_bot_challenge
'611': rwSlxKNk_deny
'612': rwSlxKNk_goodbye
'613': rwSlxKNk_greet
'614': rwSlxKNk_mood_great
'615': rwSlxKNk_mood_unhappy
'616': tJVGQhGy_affirm
'617': tJVGQhGy_bot_challenge
'618': tJVGQhGy_deny
'619': tJVGQhGy_goodbye
'620': tJVGQhGy_greet
'621': tJVGQhGy_mood_great
'622': tJVGQhGy_mood_unhappy
'623': tLSmFqWP_affirm
'624': tLSmFqWP_bot_challenge
'625': tLSmFqWP_deny
'626': tLSmFqWP_goodbye
'627': tLSmFqWP_greet
'628': tLSmFqWP_mood_great
'629': tLSmFqWP_mood_unhappy
'630': vMkgGNgT_affirm
'631': vMkgGNgT_bot_challenge
'632': vMkgGNgT_deny
'633': vMkgGNgT_goodbye
'634': vMkgGNgT_greet
'635': vMkgGNgT_mood_great
'636': vMkgGNgT_mood_unhappy
'637': vbLxRngK_affirm
'638': vbLxRngK_bot_challenge
'639': vbLxRngK_deny
'640': vbLxRngK_goodbye
'641': vbLxRngK_greet
'642': vbLxRngK_mood_great
'643': vbLxRngK_mood_unhappy
'644': xLgsvYVq_affirm
'645': xLgsvYVq_bot_challenge
'646': xLgsvYVq_deny
'647': xLgsvYVq_goodbye
'648': xLgsvYVq_greet
'649': xLgsvYVq_mood_great
'650': xLgsvYVq_mood_unhappy
'651': xVNQydsG_affirm
'652': xVNQydsG_bot_challenge
'653': xVNQydsG_deny
'654': xVNQydsG_goodbye
'655': xVNQydsG_greet
'656': xVNQydsG_mood_great
'657': xVNQydsG_mood_unhappy
'658': xyVVNmKR_affirm
'659': xyVVNmKR_bot_challenge
'660': xyVVNmKR_deny
'661': xyVVNmKR_goodbye
'662': xyVVNmKR_greet
'663': xyVVNmKR_mood_great
'664': xyVVNmKR_mood_unhappy
'665': xzkbNJpl_affirm
'666': xzkbNJpl_bot_challenge
'667': xzkbNJpl_deny
'668': xzkbNJpl_goodbye
'669': xzkbNJpl_greet
'670': xzkbNJpl_mood_great
'671': xzkbNJpl_mood_unhappy
'672': zTWhfmnX_affirm
'673': zTWhfmnX_bot_challenge
'674': zTWhfmnX_deny
'675': zTWhfmnX_goodbye
'676': zTWhfmnX_greet
'677': zTWhfmnX_mood_great
'678': zTWhfmnX_mood_unhappy
'679': zYNRdXgt_affirm
'680': zYNRdXgt_bot_challenge
'681': zYNRdXgt_deny
'682': zYNRdXgt_goodbye
'683': zYNRdXgt_greet
'684': zYNRdXgt_mood_great
'685': zYNRdXgt_mood_unhappy
'686': zkRpqrMw_affirm
'687': zkRpqrMw_bot_challenge
'688': zkRpqrMw_deny
'689': zkRpqrMw_goodbye
'690': zkRpqrMw_greet
'691': zkRpqrMw_mood_great
'692': zkRpqrMw_mood_unhappy
'693': znXLDhJj_affirm
'694': znXLDhJj_bot_challenge
'695': znXLDhJj_deny
'696': znXLDhJj_goodbye
'697': znXLDhJj_greet
'698': znXLDhJj_mood_great
'699': znXLDhJj_mood_unhappy
splits:
- name: train
num_bytes: 241452
num_examples: 6800
download_size: 93746
dataset_size: 241452
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_wnli_never_negator | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 1232
num_examples: 6
- name: test
num_bytes: 1276
num_examples: 4
- name: train
num_bytes: 4256
num_examples: 19
download_size: 12767
dataset_size: 6764
---
# Dataset Card for "MULTI_VALUE_wnli_never_negator"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DIAS123/MARCUS | ---
license: openrail
---
|
kye/all-allenai-python | ---
license: mit
---
|
OdiaGenAI/all_combined_bengali_252k | ---
license: cc-by-nc-sa-4.0
task_categories:
- text-generation
language:
- bn
pretty_name: all_combined_bengali_252K
size_categories:
- 100K<n<1M
---
# Dataset Card for all_combined_bengali_252K
## Dataset Description
- **Homepage:** https://www.odiagenai.org/
- **Repository:** https://github.com/OdiaGenAI
- **Point of Contact:** Shantipriya Parida and Sambit Sekhar
### Dataset Summary
This dataset is a mix of Bengali instruction sets translated from the following open-source instruction sets:
* Dolly
* Alpaca
* ChatDoctor
* Roleplay
* GSM
In this dataset, Bengali instruction, input, and output strings are available.
### Supported Tasks and Leaderboards
Instruction tuning for Large Language Models (LLMs)
### Languages
Bengali
## Dataset Structure
JSON
### Data Fields
- output (string)
- data_source (string)
- instruction (string)
- input (string)
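As a minimal sketch of what one record looks like given the four fields above (the sample values below are illustrative, not taken from the dataset):

```python
import json

# Illustrative record following the four documented fields;
# the values are made up, not drawn from the actual dataset.
record_json = """
{
  "instruction": "নিচের প্রশ্নের উত্তর দিন।",
  "input": "বাংলাদেশের রাজধানী কী?",
  "output": "বাংলাদেশের রাজধানী ঢাকা।",
  "data_source": "alpaca"
}
"""

record = json.loads(record_json)

# Every record carries exactly these four string fields.
assert set(record) == {"instruction", "input", "output", "data_source"}
assert all(isinstance(value, str) for value in record.values())
```

The same structure can be streamed with `datasets.load_dataset` once the JSON files are downloaded.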
### Licensing Information
This work is licensed under a
[Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License][cc-by-nc-sa].
[![CC BY-NC-SA 4.0][cc-by-nc-sa-image]][cc-by-nc-sa]
[cc-by-nc-sa]: http://creativecommons.org/licenses/by-nc-sa/4.0/
[cc-by-nc-sa-image]: https://licensebuttons.net/l/by-nc-sa/4.0/88x31.png
[cc-by-nc-sa-shield]: https://img.shields.io/badge/License-CC%20BY--NC--SA%204.0-lightgrey.svg
### Citation Information
If you find this repository useful, please consider giving 👏 and citing:
```
@misc{OdiaGenAI,
author = {Shantipriya Parida and Sambit Sekhar and Guneet Singh Kohli and Arghyadeep Sen and Shashikanta Sahoo},
title = {Bengali Instruction Set},
year = {2023},
publisher = {Hugging Face},
journal = {Hugging Face repository},
howpublished = {\url{https://huggingface.co/OdiaGenAI}},
}
```
### Contributions
- Shantipriya Parida
- Sambit Sekhar
- Guneet Singh Kohli
- Arghyadeep Sen
- Shashikanta Sahoo |
Babak-Behkamkia/GPT-3_stance | ---
license: mit
---
|
benayas/atis_nlpaug_20pct_v0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 415516
num_examples: 4455
download_size: 177091
dataset_size: 415516
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sodarev/wp-plugin-review | ---
license: gpl-3.0
---
|
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r16-gate_up_down | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-gate_up_down
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-gate_up_down)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r16-gate_up_down\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-29T07:06:26.845938](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r16-gate_up_down/blob/main/results_2023-10-29T07-06-26.845938.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.20836828859060402,\n\
\ \"em_stderr\": 0.004159269440162747,\n \"f1\": 0.2507906879194633,\n\
\ \"f1_stderr\": 0.004162090421371717,\n \"acc\": 0.43807672814244847,\n\
\ \"acc_stderr\": 0.01035305451841861\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.20836828859060402,\n \"em_stderr\": 0.004159269440162747,\n\
\ \"f1\": 0.2507906879194633,\n \"f1_stderr\": 0.004162090421371717\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11372251705837756,\n \
\ \"acc_stderr\": 0.008744810131034056\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.011961298905803166\n\
\ }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-gate_up_down
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|arc:challenge|25_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_29T07_06_26.845938
path:
- '**/details_harness|drop|3_2023-10-29T07-06-26.845938.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-29T07-06-26.845938.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_29T07_06_26.845938
path:
- '**/details_harness|gsm8k|5_2023-10-29T07-06-26.845938.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-29T07-06-26.845938.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hellaswag|10_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-01-17.783068.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T10-01-17.783068.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T10-01-17.783068.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_29T07_06_26.845938
path:
- '**/details_harness|winogrande|5_2023-10-29T07-06-26.845938.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-29T07-06-26.845938.parquet'
- config_name: results
data_files:
- split: 2023_10_10T10_01_17.783068
path:
- results_2023-10-10T10-01-17.783068.parquet
- split: 2023_10_29T07_06_26.845938
path:
- results_2023-10-29T07-06-26.845938.parquet
- split: latest
path:
- results_2023-10-29T07-06-26.845938.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-gate_up_down
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-gate_up_down
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r16-gate_up_down",
"harness_winogrande_5",
split="latest")
```
## Latest results
These are the [latest results from run 2023-10-29T07:06:26.845938](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r16-gate_up_down/blob/main/results_2023-10-29T07-06-26.845938.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the "results" configuration and in the "latest" split of its own configuration):
```python
{
"all": {
"em": 0.20836828859060402,
"em_stderr": 0.004159269440162747,
"f1": 0.2507906879194633,
"f1_stderr": 0.004162090421371717,
"acc": 0.43807672814244847,
"acc_stderr": 0.01035305451841861
},
"harness|drop|3": {
"em": 0.20836828859060402,
"em_stderr": 0.004159269440162747,
"f1": 0.2507906879194633,
"f1_stderr": 0.004162090421371717
},
"harness|gsm8k|5": {
"acc": 0.11372251705837756,
"acc_stderr": 0.008744810131034056
},
"harness|winogrande|5": {
"acc": 0.7624309392265194,
"acc_stderr": 0.011961298905803166
}
}
```
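As a minimal sketch (not part of the original card, and assuming you have the results JSON above in hand), the per-task accuracies can be pulled out of the aggregated figures with plain dict handling:

```python
# Aggregated metrics copied from the "Latest results" snippet above.
latest = {
    "harness|gsm8k|5": {"acc": 0.11372251705837756, "acc_stderr": 0.008744810131034056},
    "harness|winogrande|5": {"acc": 0.7624309392265194, "acc_stderr": 0.011961298905803166},
}

# The task name sits between the pipes in each key, e.g. "harness|gsm8k|5" -> "gsm8k".
accs = {key.split("|")[1]: metrics["acc"] for key, metrics in latest.items()}
print(accs)
```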
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
aimsks/ts-aims-reefscapes-satellite-segmentation | ---
dataset_info:
features:
- name: image
dtype: binary
- name: segmentation
dtype: binary
- name: class_label
dtype: string
- name: bbox_epsg32754
sequence: float64
splits:
- name: train
num_bytes: 1165209562
num_examples: 1748
- name: test
num_bytes: 657217424
num_examples: 981
- name: validation
num_bytes: 324309904
num_examples: 487
download_size: 1322361933
dataset_size: 2146736890
---
# Dataset Card for "ts-aims-reefscapes-satellite-segmentation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jiahuan/nlg_mix_en_de_it | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: history
sequence:
sequence: string
splits:
- name: train
num_bytes: 8883799
num_examples: 17391
- name: val
num_bytes: 2825880
num_examples: 5670
- name: test
num_bytes: 5666404
num_examples: 11214
download_size: 3709146
dataset_size: 17376083
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
---
|
jondurbin/omega-multimodal-ids | ---
license: apache-2.0
---
|
watermelonhydro/es_en_2999 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 1361312
num_examples: 2999
download_size: 770888
dataset_size: 1361312
---
# Dataset Card for "es_en_2999"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ovior/twitter_dataset_1713115518 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2308434
num_examples: 7267
download_size: 1313472
dataset_size: 2308434
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_sst2_em_obj_pronoun | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 4356
num_examples: 31
- name: test
num_bytes: 11772
num_examples: 83
- name: train
num_bytes: 204729
num_examples: 1852
download_size: 106321
dataset_size: 220857
---
# Dataset Card for "MULTI_VALUE_sst2_em_obj_pronoun"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
OgYrKo/sample_dataset | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': ukrayina
'1': svit
'2': politika
'3': groshi
splits:
- name: train
num_bytes: 3385259
num_examples: 982
- name: test
num_bytes: 841947
num_examples: 246
download_size: 2092020
dataset_size: 4227206
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
AdapterOcean/med_alpaca_standardized_cluster_55_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 15856196
num_examples: 24447
download_size: 7999140
dataset_size: 15856196
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_55_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-37000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 995732
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Dahoas/unet-flowers | ---
dataset_info:
features:
- name: images
sequence:
sequence:
sequence: uint8
splits:
- name: train
num_bytes: 26771456
num_examples: 2048
download_size: 25284415
dataset_size: 26771456
---
# Dataset Card for "unet-flowers"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
orpo-explorers/OpenHermesPreferences-25k | ---
dataset_info:
features:
- name: source
dtype: string
- name: category
dtype: string
- name: prompt
dtype: string
- name: candidates_completions
sequence: string
- name: candidate_policies
sequence: string
- name: ranks
sequence: int64
- name: rank_str
dtype: string
- name: chosen_policy
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected_policy
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 183505880.9083467
num_examples: 25000
download_size: 91178447
dataset_size: 183505880.9083467
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
udmurtNLP/flores-250-rus-udm | ---
configs:
- config_name: default
data_files:
- split: sentences
path: data/sentences-*
dataset_info:
features:
- name: rus
dtype: string
- name: udm
dtype: string
splits:
- name: sentences
num_bytes: 129728
num_examples: 250
download_size: 72479
dataset_size: 129728
language:
- udm
---
# FLORES-250, Russian and Udmurt sentences
Compared to the original FLORES-250, in the Russian version the sentence
`Вечер начал певец Санджу Шарма, за ним выступил Джай Шанкар Чаудхари. esented the chhappan bhog bhajan также. Ему аккомпанировал певец Раджу Кханделвал`
was changed to
`Вечер начал певец Санджу Шарма, за ним выступил Джай Шанкар Чаудхари. Ему аккомпанировал певец Раджу Кханделвал`.
## Usage
```py
from datasets import load_dataset
dataset = load_dataset("udmurtNLP/flores-250-rus-udm")
```
## Citation
```
@inproceedings{yankovskaya-etal-2023-machine,
title = "Machine Translation for Low-resource {F}inno-{U}gric Languages",
author = {Yankovskaya, Lisa and
Tars, Maali and
T{\"a}ttar, Andre and
Fishel, Mark},
booktitle = "Proceedings of the 24th Nordic Conference on Computational Linguistics (NoDaLiDa)",
month = may,
year = "2023",
address = "T{\'o}rshavn, Faroe Islands",
publisher = "University of Tartu Library",
url = "https://aclanthology.org/2023.nodalida-1.77",
pages = "762--771",
abstract = "This paper focuses on neural machine translation (NMT) for low-resource Finno-Ugric languages. Our contributions are three-fold: (1) we extend existing and collect new parallel and monolingual corpora for 20 languages, (2) we expand the 200-language translation benchmark FLORES-200 with manual translations into nine new languages, and (3) we present experiments using the collected data to create NMT systems for the included languages and investigate the impact of back-translation data on the NMT performance for low-resource languages. Experimental results show that carefully selected limited amounts of back-translation directions yield the best results in terms of translation scores, for both high-resource and low-resource output languages.",
}
``` |
CyberHarem/chang_chun_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of chang_chun/長春/长春 (Azur Lane)
This is the dataset of chang_chun/長春/长春 (Azur Lane), containing 43 images and their tags.
The core tags of this character are `long_hair, bangs, hair_ornament, animal_ears, red_eyes, parted_bangs, purple_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 43 | 49.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chang_chun_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 43 | 28.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chang_chun_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 101 | 61.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chang_chun_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 43 | 44.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chang_chun_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 101 | 85.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chang_chun_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/chang_chun_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 19 |  |  |  |  |  | 1girl, looking_at_viewer, solo, animal_hood, blush, hood_up, long_sleeves, sleeves_past_fingers, fur-trimmed_sleeves, sidelocks, white_pantyhose, wide_sleeves, grey_hair, hooded_capelet, :3, fur-trimmed_capelet, open_mouth, blue_coat, tiger_ears, tiger_print, white_background, brown_footwear, fur-trimmed_boots, fur-trimmed_hood, hairclip, simple_background, very_long_hair, :d, armband, closed_mouth, cross-laced_footwear |
| 1 | 8 |  |  |  |  |  | 1girl, armband, looking_at_viewer, solo, white_hair, fake_animal_ears, fur_trim, goggles_on_head, open_mouth, white_pantyhose, black_footwear, boots, coat, white_gloves, wide_sleeves, blush, full_body, headphones, smile, rigging, high_heels, hood, long_sleeves, mechanical_ears, rocket_launcher, simple_background, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | animal_hood | blush | hood_up | long_sleeves | sleeves_past_fingers | fur-trimmed_sleeves | sidelocks | white_pantyhose | wide_sleeves | grey_hair | hooded_capelet | :3 | fur-trimmed_capelet | open_mouth | blue_coat | tiger_ears | tiger_print | white_background | brown_footwear | fur-trimmed_boots | fur-trimmed_hood | hairclip | simple_background | very_long_hair | :d | armband | closed_mouth | cross-laced_footwear | white_hair | fake_animal_ears | fur_trim | goggles_on_head | black_footwear | boots | coat | white_gloves | full_body | headphones | smile | rigging | high_heels | hood | mechanical_ears | rocket_launcher |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------------|:--------|:----------|:---------------|:-----------------------|:----------------------|:------------|:------------------|:---------------|:------------|:-----------------|:-----|:----------------------|:-------------|:------------|:-------------|:--------------|:-------------------|:-----------------|:--------------------|:-------------------|:-----------|:--------------------|:-----------------|:-----|:----------|:---------------|:-----------------------|:-------------|:-------------------|:-----------|:------------------|:-----------------|:--------|:-------|:---------------|:------------|:-------------|:--------|:----------|:-------------|:-------|:------------------|:------------------|
| 0 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | | X | | X | | | | X | X | | | | | X | | | | X | | | | | X | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Ubaidbhat/databaseBenchmarkQA | ---
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: source_doc
dtype: string
- name: groundedness_score
dtype: int64
- name: relevance_score
dtype: int64
splits:
- name: train
num_bytes: 513993
num_examples: 263
download_size: 268275
dataset_size: 513993
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ZionAv1/Z | ---
license: gemma
---
|
Moritz-Pfeifer/FT_news_classification | ---
license: mit
---
|
Miladsol/Fa-to-En | ---
dataset_info:
features:
- name: fa
dtype: string
- name: en
dtype: string
splits:
- name: train
num_bytes: 205131405
num_examples: 1254412
- name: val
num_bytes: 43840074
num_examples: 268803
- name: test
num_bytes: 43840565
num_examples: 268803
download_size: 185949756
dataset_size: 292812044
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
---
|
javlonDev/speaking | ---
license: mit
---
|
Multimodal-Fatima/OxfordPets_test_facebook_opt_350m_Visclues_ns_20 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_5_bs_3
num_bytes: 292097.0
num_examples: 20
download_size: 0
dataset_size: 292097.0
---
# Dataset Card for "OxfordPets_test_facebook_opt_350m_Visclues_ns_20"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
andersonbcdefg/lima-ai-filtered | ---
dataset_info:
features:
- name: conversations
sequence: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 1447954
num_examples: 446
download_size: 833674
dataset_size: 1447954
---
# Dataset Card for "lima-ai-filtered"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/VQAv2_sample_validation_google_flan_t5_xxl_mode_T_A_D_PNP_NO_FILTER_C_Q_rices_ns_500 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
sequence: string
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0
num_bytes: 1246308
num_examples: 500
download_size: 289167
dataset_size: 1246308
---
# Dataset Card for "VQAv2_sample_validation_google_flan_t5_xxl_mode_T_A_D_PNP_NO_FILTER_C_Q_rices_ns_500"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/90c8145e | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 178
num_examples: 10
download_size: 1336
dataset_size: 178
---
# Dataset Card for "90c8145e"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jtjt520j/CSpider_for_Qwen | ---
license: apache-2.0
---
|
ConvLab/emowoz | ---
license: apache-2.0
---
|
indicbench/truthfulqa_gu | ---
dataset_info:
- config_name: default
features:
- name: _data_files
list:
- name: filename
dtype: string
- name: _fingerprint
dtype: string
- name: _format_columns
dtype: 'null'
- name: _format_type
dtype: 'null'
- name: _output_all_columns
dtype: bool
- name: _split
dtype: 'null'
splits:
- name: train
num_bytes: 107
num_examples: 2
download_size: 3274
dataset_size: 107
- config_name: generation
features:
- name: type
dtype: string
- name: category
dtype: string
- name: question
dtype: string
- name: best_answer
dtype: string
- name: correct_answers
sequence: string
- name: incorrect_answers
sequence: string
- name: source
dtype: string
splits:
- name: validation
num_bytes: 1047203
num_examples: 817
download_size: 339966
dataset_size: 1047203
- config_name: multiple_choice
features:
- name: question
dtype: string
- name: mc1_targets
struct:
- name: choices
sequence: string
- name: labels
sequence: int64
- name: mc2_targets
struct:
- name: choices
sequence: string
- name: labels
sequence: int64
splits:
- name: validation
num_bytes: 1459677
num_examples: 817
download_size: 437522
dataset_size: 1459677
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- config_name: generation
data_files:
- split: validation
path: generation/validation-*
- config_name: multiple_choice
data_files:
- split: validation
path: multiple_choice/validation-*
---
|
Seenka/spots_audios | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: id
dtype: int64
- name: brand_id
dtype: int64
- name: brand_name
dtype: string
- name: text
dtype: string
- name: title
dtype: string
- name: created_at
dtype: timestamp[us, tz=UTC]
- name: confirmed_at
dtype: timestamp[us, tz=UTC]
- name: confirmed_by_id
dtype: int64
- name: clip_url
dtype: string
- name: duration
dtype: float64
- name: thumb_url
dtype: string
- name: clip_duration
dtype: float64
- name: filename
dtype: string
- name: embeddings
sequence:
sequence: float32
splits:
- name: train
num_bytes: 261559300.0
num_examples: 417
download_size: 242934514
dataset_size: 261559300.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "spots_audios"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gianma/camera_cleaned_8192_174 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: is_camera
dtype: bool
- name: reference
dtype: string
- name: summary
dtype: string
- name: tokenized_len_total
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1549422
num_examples: 97
- name: validation
num_bytes: 103831
num_examples: 6
- name: test
num_bytes: 107835
num_examples: 6
download_size: 728059
dataset_size: 1761088
---
# Dataset Card for "camera_cleaned_8192_174"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/vulcan_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of vulcan/ヴァルカン/火神 (Arknights)
This is the dataset of vulcan/ヴァルカン/火神 (Arknights), containing 68 images and their tags.
The core tags of this character are `horns, red_eyes, short_hair, grey_hair, brown_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 68 | 92.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vulcan_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 68 | 79.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vulcan_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 155 | 154.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vulcan_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/vulcan_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, black_jacket, holding_weapon, solo, hood, looking_at_viewer, shield, choker, collarbone, long_sleeves, open_jacket, oripathy_lesion_(arknights), black_skirt, asymmetrical_legwear, closed_mouth, hammer, pantyhose, pleated_skirt, thigh_strap |
| 1 | 17 |  |  |  |  |  | 1girl, solo, looking_at_viewer, black_jacket, closed_mouth, simple_background, upper_body, choker, collarbone, hood_up, infection_monitor_(arknights), oripathy_lesion_(arknights), white_background, hooded_jacket, open_jacket, shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_jacket | holding_weapon | solo | hood | looking_at_viewer | shield | choker | collarbone | long_sleeves | open_jacket | oripathy_lesion_(arknights) | black_skirt | asymmetrical_legwear | closed_mouth | hammer | pantyhose | pleated_skirt | thigh_strap | simple_background | upper_body | hood_up | infection_monitor_(arknights) | white_background | hooded_jacket | shirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-----------------|:-------|:-------|:--------------------|:---------|:---------|:-------------|:---------------|:--------------|:------------------------------|:--------------|:-----------------------|:---------------|:---------|:------------|:----------------|:--------------|:--------------------|:-------------|:----------|:--------------------------------|:-------------------|:----------------|:--------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 1 | 17 |  |  |  |  |  | X | X | | X | | X | | X | X | | X | X | | | X | | | | | X | X | X | X | X | X | X |
|
oskarvanderwal/bbq | ---
license: cc-by-4.0
language:
- en
tags:
- social bias
pretty_name: BBQ
configs:
- config_name: All
data_files:
- split: test
path: data/All.jsonl
default: true
- config_name: Age
data_files:
- split: test
path: data/Age.jsonl
- config_name: Disability_status
data_files:
- split: test
path: data/Disability_status.jsonl
- config_name: Gender_identity
data_files:
- split: test
path: data/Gender_identity.jsonl
- config_name: Nationality
data_files:
- split: test
path: data/Nationality.jsonl
- config_name: Physical_appearance
data_files:
- split: test
path: data/Physical_appearance.jsonl
- config_name: Race_ethnicity
data_files:
- split: test
path: data/Race_ethnicity.jsonl
- config_name: Race_x_SES
data_files:
- split: test
path: data/Race_x_SES.jsonl
- config_name: Race_x_gender
data_files:
- split: test
path: data/Race_x_gender.jsonl
- config_name: Religion
data_files:
- split: test
path: data/Religion.jsonl
- config_name: SES
data_files:
- split: test
path: data/SES.jsonl
- config_name: Sexual_orientation
data_files:
- split: test
path: data/Sexual_orientation.jsonl
---
# BBQ
Repository for the Bias Benchmark for QA dataset.
https://github.com/nyu-mll/BBQ
Authors: Alicia Parrish, Angelica Chen, Nikita Nangia, Vishakh Padmakumar, Jason Phang, Jana Thompson, Phu Mon Htut, and Samuel R. Bowman.
*This repository is a fork of https://huggingface.co/datasets/heegyu/bbq, and adds the "All" configuration containing all subsets.*
## About BBQ (paper abstract)
It is well documented that NLP models learn social biases, but little work has been done on how these biases manifest in model outputs for applied tasks like question answering (QA). We introduce the Bias Benchmark for QA (BBQ), a dataset of question sets constructed by the authors that highlight attested social biases against people belonging to protected classes along nine social dimensions relevant for U.S. English-speaking contexts. Our task evaluates model responses at two levels: (i) given an under-informative context, we test how strongly responses reflect social biases, and (ii) given an adequately informative context, we test whether the model's biases override a correct answer choice. We find that models often rely on stereotypes when the context is under-informative, meaning the model's outputs consistently reproduce harmful biases in this setting. Though models are more accurate when the context provides an informative answer, they still rely on stereotypes and average up to 3.4 percentage points higher accuracy when the correct answer aligns with a social bias than when it conflicts, with this difference widening to over 5 points on examples targeting gender for most models tested.
## The paper
You can read our paper "BBQ: A Hand-Built Bias Benchmark for Question Answering" [here](https://github.com/nyu-mll/BBQ/blob/main/QA_bias_benchmark.pdf). The paper has been published in the Findings of ACL 2022 [here](https://aclanthology.org/2022.findings-acl.165/).
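Since this fork's "All" configuration is simply the union of the per-category subsets, its relationship to the individual JSONL files can be sketched offline. This is a minimal illustration using synthetic placeholder rows (not real BBQ examples); the file names follow the configs listed above:

```python
import json
import pathlib
import tempfile

# Synthetic stand-ins for two of the per-category JSONL files.
subsets = {
    "Age.jsonl": [{"category": "Age", "question": "..."}],
    "Religion.jsonl": [{"category": "Religion", "question": "..."}],
}
tmp = pathlib.Path(tempfile.mkdtemp())
for name, rows in subsets.items():
    (tmp / name).write_text("\n".join(json.dumps(r) for r in rows))

# The "All" configuration is just every subset's rows combined.
all_rows = []
for path in sorted(tmp.glob("*.jsonl")):
    all_rows += [json.loads(line) for line in path.read_text().splitlines()]
print(len(all_rows))  # 2
```

In practice you would load a single category with `load_dataset("oskarvanderwal/bbq", "Age")` or everything at once with the default `All` config.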
|
CyberHarem/asta_starrail | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of asta/アスター/艾丝妲/아스타 (Honkai: Star Rail)
This is the dataset of asta/アスター/艾丝妲/아스타 (Honkai: Star Rail), containing 194 images and their tags.
The core tags of this character are `bangs, braid, blue_eyes, hair_ornament, pink_hair, breasts, bow, black_bow, long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 194 | 321.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/asta_starrail/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 194 | 156.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/asta_starrail/resolve/main/dataset-800.zip) | IMG+TXT | Dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 460 | 341.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/asta_starrail/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 194 | 269.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/asta_starrail/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 460 | 533.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/asta_starrail/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/asta_starrail',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, looking_at_viewer, solo, white_shirt, bare_shoulders, collared_shirt, sleeveless_shirt, virtual_youtuber, blush, one_side_up, smile, upper_body, white_background, black_bowtie, simple_background, choker, black_skirt, detached_sleeves, frills, hair_between_eyes, medium_breasts, open_mouth |
| 1 | 5 |  |  |  |  |  | 1girl, black_skirt, high_heels, id_card, looking_at_viewer, medium_hair, purple_skirt, smile, solo, white_shirt, bowtie, brown_hair, sleeveless_shirt, thigh_strap, bare_shoulders, black_footwear, detached_sleeves, sitting, full_body |
| 2 | 11 |  |  |  |  |  | 1girl, hetero, 1boy, blush, nipples, solo_focus, sex, completely_nude, penis, sweat, large_breasts, mosaic_censoring, navel, open_mouth, vaginal, collarbone, looking_at_viewer, spread_legs, female_pubic_hair, id_card, medium_breasts, on_back, pov, pussy, short_hair, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | white_shirt | bare_shoulders | collared_shirt | sleeveless_shirt | virtual_youtuber | blush | one_side_up | smile | upper_body | white_background | black_bowtie | simple_background | choker | black_skirt | detached_sleeves | frills | hair_between_eyes | medium_breasts | open_mouth | high_heels | id_card | medium_hair | purple_skirt | bowtie | brown_hair | thigh_strap | black_footwear | sitting | full_body | hetero | 1boy | nipples | solo_focus | sex | completely_nude | penis | sweat | large_breasts | mosaic_censoring | navel | vaginal | collarbone | spread_legs | female_pubic_hair | on_back | pov | pussy | short_hair |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------------|:-----------------|:-----------------|:-------------------|:-------------------|:--------|:--------------|:--------|:-------------|:-------------------|:---------------|:--------------------|:---------|:--------------|:-------------------|:---------|:--------------------|:-----------------|:-------------|:-------------|:----------|:--------------|:---------------|:---------|:-------------|:--------------|:-----------------|:----------|:------------|:---------|:-------|:----------|:-------------|:------|:------------------|:--------|:--------|:----------------|:-------------------|:--------|:----------|:-------------|:--------------|:--------------------|:----------|:------|:--------|:-------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | | X | | | | X | | | | | | X | X | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 2 | 11 |  |  |  |  |  | X | X | | | | | | | X | | X | | | | | | | | | | X | X | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
XenocodeRCE/etre_et_temps | ---
language:
- fr
--- |
Multimodal-Fatima/OxfordPets_test_facebook_opt_2.7b_Attributes_ns_3669 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_0_bs_16
num_bytes: 121046369.375
num_examples: 3669
- name: fewshot_1_bs_16
num_bytes: 121909353.375
num_examples: 3669
- name: fewshot_3_bs_16
num_bytes: 123709332.375
num_examples: 3669
- name: fewshot_5_bs_16
num_bytes: 125501830.375
num_examples: 3669
- name: fewshot_8_bs_16
num_bytes: 128203042.375
num_examples: 3669
download_size: 602512072
dataset_size: 620369927.875
---
# Dataset Card for "OxfordPets_test_facebook_opt_2.7b_Attributes_ns_3669"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlekseyKorshuk/evol-codealpaca-pairwise-chatml | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 23863820
num_examples: 10553
download_size: 12436195
dataset_size: 23863820
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
estebancrop/pablolobato | ---
license: unknown
---
|
AlShurbaji/labels | ---
license: other
---
|
CS-8321/sd1.5-images | ---
dataset_info:
features:
- name: image
dtype: image
- name: prompt_idx
dtype: int64
- name: prompt
dtype: string
- name: interrogation
dtype: string
splits:
- name: train
num_bytes: 1451961382.0
num_examples: 3750
download_size: 1464051529
dataset_size: 1451961382.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|