datasetId stringlengths 2 117 | card stringlengths 19 1.01M |
|---|---|
Mongi-BESBES/GHI-ST | ---
license: apache-2.0
---
|
Cafet/whisper_test | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: test
num_bytes: 1818623016.0
num_examples: 1893
download_size: 369696792
dataset_size: 1818623016.0
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
daniel-gordon/biopharma-dive | ---
license: apache-2.0
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 381329.7472924188
num_examples: 1745
- name: validation
num_bytes: 21197.126353790612
num_examples: 97
- name: test
num_bytes: 21197.126353790612
num_examples: 97
download_size: 253861
dataset_size: 423724.00000000006
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
winglian/sn18-all-20240204 | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: run_id
dtype: string
- name: step
dtype: int64
- name: uid
dtype: int64
- name: id
dtype: string
splits:
- name: train
num_bytes: 16095190820
num_examples: 8595545
download_size: 7091700915
dataset_size: 16095190820
---
# Dataset Card for "sn18-all-20240204"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
GSQA/speech-alpaca-gpt4-unit | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: speech_input
dtype: string
- name: input_speaker
dtype: string
- name: output_speaker
dtype: string
- name: mhubert_layer11_code1000_input_code
dtype: string
- name: mhubert_layer11_code1000_output_audio
dtype: string
- name: hubert_layer6_code100_input_code
dtype: string
- name: hubert_layer6_code100_output_audio
dtype: string
splits:
- name: train
num_bytes: 1718767489
num_examples: 51349
download_size: 654738368
dataset_size: 1718767489
---
# Dataset Card for "speech-alpaca-gpt4-unit"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MattGPT/XS-AI_Base | ---
license: cc-by-nc-4.0
---
|
winglian/cortex-dumpster-fire-gemma-v5-crashed | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: 'null'
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 9079199058
num_examples: 1573000
download_size: 4832892228
dataset_size: 9079199058
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mozilla-foundation/common_voice_16_0 | ---
pretty_name: Common Voice Corpus 16
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- ab
- af
- am
- ar
- as
- ast
- az
- ba
- bas
- be
- bg
- bn
- br
- ca
- ckb
- cnh
- cs
- cv
- cy
- da
- de
- dv
- dyu
- el
- en
- eo
- es
- et
- eu
- fa
- fi
- fr
- fy
- ga
- gl
- gn
- ha
- he
- hi
- hsb
- hu
- hy
- ia
- id
- ig
- is
- it
- ja
- ka
- kab
- kk
- kmr
- ko
- ky
- lg
- lij
- lo
- lt
- ltg
- lv
- mdf
- mhr
- mk
- ml
- mn
- mr
- mrj
- mt
- myv
- nan
- ne
- nhi
- nl
- nn
- oc
- or
- os
- pa
- pl
- ps
- pt
- quy
- rm
- ro
- ru
- rw
- sah
- sat
- sc
- sk
- skr
- sl
- sq
- sr
- sv
- sw
- ta
- te
- th
- ti
- tig
- tk
- tok
- tr
- tt
- tw
- ug
- uk
- ur
- uz
- vi
- vot
- yi
- yo
- yue
- zgh
- zh
language_bcp47:
- zh-CN
- zh-HK
- zh-TW
- sv-SE
- rm-sursilv
- rm-vallader
- pa-IN
- nn-NO
- ne-NP
- nan-tw
- hy-AM
- ga-IE
- fy-NL
license:
- cc0-1.0
multilinguality:
- multilingual
paperswithcode_id: common-voice
extra_gated_prompt: "By clicking on “Access repository” below, you also agree to not attempt to determine the identity of speakers in the Common Voice dataset."
---
# Dataset Card for Common Voice Corpus 16
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://commonvoice.mozilla.org/en/datasets
- **Repository:** https://github.com/common-voice/common-voice
- **Paper:** https://arxiv.org/abs/1912.06670
- **Leaderboard:** https://paperswithcode.com/dataset/common-voice
- **Point of Contact:** [Vaibhav Srivastav](mailto:vaibhav@huggingface.co)
### Dataset Summary
The Common Voice dataset consists of unique MP3 files, each paired with a corresponding text file.
Many of the 30,328 recorded hours in the dataset also include demographic metadata such as age, sex, and accent
that can help improve the accuracy of speech recognition engines.
The dataset currently consists of 19,673 validated hours in 120 languages, and more voices and languages are always being added.
Take a look at the [Languages](https://commonvoice.mozilla.org/en/languages) page to request a language or start contributing.
### Languages
```
Abkhaz, Afrikaans, Albanian, Amharic, Arabic, Armenian, Assamese, Asturian, Azerbaijani, Basaa, Bashkir, Basque, Belarusian, Bengali, Breton, Bulgarian, Cantonese, Catalan, Central Kurdish, Chinese (China), Chinese (Hong Kong), Chinese (Taiwan), Chuvash, Czech, Danish, Dhivehi, Dioula, Dutch, English, Erzya, Esperanto, Estonian, Finnish, French, Frisian, Galician, Georgian, German, Greek, Guarani, Hakha Chin, Hausa, Hebrew, Hill Mari, Hindi, Hungarian, Icelandic, Igbo, Indonesian, Interlingua, Irish, Italian, Japanese, Kabyle, Kazakh, Kinyarwanda, Korean, Kurmanji Kurdish, Kyrgyz, Lao, Latgalian, Latvian, Ligurian, Lithuanian, Luganda, Macedonian, Malayalam, Maltese, Marathi, Meadow Mari, Moksha, Mongolian, Nepali, Norwegian Nynorsk, Occitan, Odia, Ossetian, Pashto, Persian, Polish, Portuguese, Punjabi, Quechua Chanka, Romanian, Romansh Sursilvan, Romansh Vallader, Russian, Sakha, Santali (Ol Chiki), Saraiki, Sardinian, Serbian, Slovak, Slovenian, Sorbian, Upper, Spanish, Swahili, Swedish, Taiwanese (Minnan), Tamazight, Tamil, Tatar, Telugu, Thai, Tigre, Tigrinya, Toki Pona, Turkish, Turkmen, Twi, Ukrainian, Urdu, Uyghur, Uzbek, Vietnamese, Votic, Welsh, Western Sierra Puebla Nahuatl, Yiddish, Yoruba
```
## How to use
The `datasets` library allows you to load and pre-process your dataset in pure Python, at scale. The dataset can be downloaded and prepared in one call to your local drive by using the `load_dataset` function.
For example, to download the Hindi config, simply specify the corresponding language config name (i.e., "hi" for Hindi):
```python
from datasets import load_dataset
cv_16 = load_dataset("mozilla-foundation/common_voice_16_0", "hi", split="train")
```
Using the datasets library, you can also stream the dataset on-the-fly by adding a `streaming=True` argument to the `load_dataset` function call. Loading a dataset in streaming mode loads individual samples of the dataset at a time, rather than downloading the entire dataset to disk.
```python
from datasets import load_dataset
cv_16 = load_dataset("mozilla-foundation/common_voice_16_0", "hi", split="train", streaming=True)
print(next(iter(cv_16)))
```
*Bonus*: create a [PyTorch dataloader](https://huggingface.co/docs/datasets/use_with_pytorch) directly with your own datasets (local/streamed).
### Local
```python
from datasets import load_dataset
from torch.utils.data import DataLoader
from torch.utils.data.sampler import BatchSampler, RandomSampler
cv_16 = load_dataset("mozilla-foundation/common_voice_16_0", "hi", split="train")
batch_sampler = BatchSampler(RandomSampler(cv_16), batch_size=32, drop_last=False)
dataloader = DataLoader(cv_16, batch_sampler=batch_sampler)
```
### Streaming
```python
from datasets import load_dataset
from torch.utils.data import DataLoader
cv_16 = load_dataset("mozilla-foundation/common_voice_16_0", "hi", split="train", streaming=True)
dataloader = DataLoader(cv_16, batch_size=32)
```
To find out more about loading and preparing audio datasets, head over to [hf.co/blog/audio-datasets](https://huggingface.co/blog/audio-datasets).
### Example scripts
Train your own CTC or Seq2Seq Automatic Speech Recognition models on Common Voice 16 with `transformers` - [here](https://github.com/huggingface/transformers/tree/main/examples/pytorch/speech-recognition).
## Dataset Structure
### Data Instances
A typical data point comprises the `path` to the audio file and its `sentence`.
Additional fields include `accent`, `age`, `client_id`, `up_votes`, `down_votes`, `gender`, `locale` and `segment`.
```python
{
'client_id': 'd59478fbc1ee646a28a3c652a119379939123784d99131b865a89f8b21c81f69276c48bd574b81267d9d1a77b83b43e6d475a6cfc79c232ddbca946ae9c7afc5',
'path': 'et/clips/common_voice_et_18318995.mp3',
'audio': {
'path': 'et/clips/common_voice_et_18318995.mp3',
'array': array([-0.00048828, -0.00018311, -0.00137329, ..., 0.00079346, 0.00091553, 0.00085449], dtype=float32),
'sampling_rate': 48000
},
'sentence': 'Tasub kokku saada inimestega, keda tunned juba ammust ajast saati.',
'up_votes': 2,
'down_votes': 0,
'age': 'twenties',
'gender': 'male',
'accent': '',
'locale': 'et',
'segment': ''
}
```
### Data Fields
`client_id` (`string`): An id for which client (voice) made the recording
`path` (`string`): The path to the audio file
`audio` (`dict`): A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the `"audio"` column, *i.e.* `dataset[0]["audio"]` should **always** be preferred over `dataset["audio"][0]`.
`sentence` (`string`): The sentence the user was prompted to speak
`up_votes` (`int64`): How many upvotes the audio file has received from reviewers
`down_votes` (`int64`): How many downvotes the audio file has received from reviewers
`age` (`string`): The age of the speaker (e.g. `teens`, `twenties`, `fifties`)
`gender` (`string`): The gender of the speaker
`accent` (`string`): Accent of the speaker
`locale` (`string`): The locale of the speaker
`segment` (`string`): Usually an empty field
### Data Splits
The speech material has been subdivided into portions for dev, train, test, validated, invalidated, reported and other.
The validated data has been reviewed and received enough upvotes to be considered of high quality.
The invalidated data has been reviewed and received enough downvotes to be considered of low quality.
The reported data has been reported by users, for various reasons.
The other data has not yet been reviewed.
The dev, test and train portions all contain data that has been reviewed and deemed of high quality, split three ways.
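The split semantics above can be pictured as a simple routing rule. The sketch below is a loose, hypothetical illustration only, using the vote fields shown in the data instance; it does not reproduce Common Voice's actual review thresholds.

```python
def bucket_clip(up_votes: int, down_votes: int, reviewed: bool) -> str:
    """Loose illustration of the split semantics described above.

    NOT Common Voice's actual criteria -- the real corpus applies its own
    review thresholds; this only mirrors the prose description.
    """
    if not reviewed:
        return "other"          # not yet reviewed
    if up_votes > down_votes:
        return "validated"      # reviewed, deemed high quality
    if down_votes > up_votes:
        return "invalidated"    # reviewed, deemed low quality
    return "other"              # tie: treated as undecided here

print(bucket_clip(2, 0, True))   # validated
print(bucket_clip(0, 2, True))   # invalidated
print(bucket_clip(0, 0, False))  # other
```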
## Data Preprocessing Recommended by Hugging Face
The following are data preprocessing steps advised by the Hugging Face team. They are accompanied by an example code snippet that shows how to put them into practice.
Many examples in this dataset have trailing quotation marks, e.g. _“the cat sat on the mat.”_. These quotation marks do not change the actual meaning of the sentence, and it is near impossible to infer from the audio alone whether a sentence is a quotation. In these cases, it is advised to strip the quotation marks, leaving: _the cat sat on the mat_.
In addition, the majority of training sentences end in punctuation ( . or ? or ! ), whereas just a small proportion do not. In the dev set, **almost all** sentences end in punctuation. Thus, it is recommended to append a full-stop ( . ) to the end of the small number of training examples that do not end in punctuation.
```python
from datasets import load_dataset

ds = load_dataset("mozilla-foundation/common_voice_16_0", "en", use_auth_token=True)

def prepare_dataset(batch):
    """Function to preprocess the dataset with the .map method"""
    transcription = batch["sentence"]
    if transcription.startswith('"') and transcription.endswith('"'):
        # we can remove trailing quotation marks as they do not affect the transcription
        transcription = transcription[1:-1]
    if transcription and transcription[-1] not in [".", "?", "!"]:
        # append a full-stop to sentences that do not end in punctuation
        transcription = transcription + "."
    batch["sentence"] = transcription
    return batch

ds = ds.map(prepare_dataset, desc="preprocess dataset")
```
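The two rules can also be exercised on plain strings, without downloading the corpus. This is a minimal self-contained sketch of the same normalisation logic; the function name `normalize_transcription` is introduced here for illustration only.

```python
def normalize_transcription(transcription: str) -> str:
    """Apply the two preprocessing rules recommended above."""
    # strip wrapping quotation marks, which cannot be inferred from audio
    if transcription.startswith('"') and transcription.endswith('"'):
        transcription = transcription[1:-1]
    # append a full-stop when the sentence has no final punctuation
    if transcription and transcription[-1] not in [".", "?", "!"]:
        transcription += "."
    return transcription

print(normalize_transcription('"the cat sat on the mat"'))  # the cat sat on the mat.
print(normalize_transcription("Is it raining?"))            # Is it raining?
```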
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in the Common Voice dataset.
## Considerations for Using the Data
### Social Impact of Dataset
The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in the Common Voice dataset.
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Public Domain, [CC-0](https://creativecommons.org/share-your-work/public-domain/cc0/)
### Citation Information
```
@inproceedings{commonvoice:2020,
author = {Ardila, R. and Branson, M. and Davis, K. and Henretty, M. and Kohler, M. and Meyer, J. and Morais, R. and Saunders, L. and Tyers, F. M. and Weber, G.},
title = {Common Voice: A Massively-Multilingual Speech Corpus},
booktitle = {Proceedings of the 12th Conference on Language Resources and Evaluation (LREC 2020)},
pages = {4211--4215},
year = 2020
}
```
|
ro_sts_parallel | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- en
- ro
license:
- cc-by-4.0
multilinguality:
- multilingual
size_categories:
- 10K<n<100K
source_datasets:
- extended|other-sts-b
task_categories:
- translation
task_ids: []
paperswithcode_id: null
pretty_name: RO-STS-Parallel
dataset_info:
- config_name: ro_sts_parallel
features:
- name: translation
dtype:
translation:
languages:
- ro
- en
splits:
- name: train
num_bytes: 1563909
num_examples: 11499
- name: validation
num_bytes: 443787
num_examples: 3001
- name: test
num_bytes: 347590
num_examples: 2759
download_size: 2251694
dataset_size: 2355286
- config_name: rosts-parallel-en-ro
features:
- name: translation
dtype:
translation:
languages:
- en
- ro
splits:
- name: train
num_bytes: 1563909
num_examples: 11499
- name: validation
num_bytes: 443787
num_examples: 3001
- name: test
num_bytes: 347590
num_examples: 2759
download_size: 2251694
dataset_size: 2355286
---
# Dataset Card for RO-STS-Parallel
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [GitHub](https://github.com/dumitrescustefan/RO-STS)
- **Repository:** [GitHub](https://github.com/dumitrescustefan/RO-STS)
- **Paper:** [Needs More Information]
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** [email](dumitrescu.stefan@gmail.com)
### Dataset Summary
We present RO-STS-Parallel, a parallel Romanian-English dataset obtained by translating the [STS English dataset](https://ixa2.si.ehu.eus/stswiki/index.php/STSbenchmark) into Romanian. It contains 17,256 sentence pairs in Romanian and English.
### Supported Tasks and Leaderboards
[Needs More Information]
### Languages
The dataset text is in Romanian and English (`ro`, `en`).
## Dataset Structure
### Data Instances
An example looks like this:
```
{
'translation': {
'ro': 'Problema e si mai simpla.',
'en': 'The problem is simpler than that.'
}
}
```
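Given an instance shaped like the one above, the aligned pair can be read out directly from the nested `translation` dict. A minimal sketch using the example values shown (no download involved):

```python
# one record as returned by the dataset: a single aligned ro/en sentence pair
example = {
    "translation": {
        "ro": "Problema e si mai simpla.",
        "en": "The problem is simpler than that.",
    }
}

# access the two sides of the pair
ro = example["translation"]["ro"]
en = example["translation"]["en"]
print(ro, "->", en)
```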
### Data Fields
- translation:
- ro: text in Romanian
- en: text in English
### Data Splits
The train/validation/test splits contain 11,498/3,000/2,758 sentence pairs, respectively.
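These sizes sum to the 17,256 total quoted in the summary; a quick sanity check:

```python
# split sizes quoted above for RO-STS-Parallel
train, validation, test = 11_498, 3_000, 2_758
print(train + validation + test)  # 17256
```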
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
To construct the dataset, we first obtained automatic translations using Google's translation engine. These were then manually checked, corrected, and cross-validated by human volunteers.
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
CC BY-SA 4.0 License
### Citation Information
```
@inproceedings{dumitrescu2021liro,
title={Liro: Benchmark and leaderboard for romanian language tasks},
author={Dumitrescu, Stefan Daniel and Rebeja, Petru and Lorincz, Beata and Gaman, Mihaela and Avram, Andrei and Ilie, Mihai and Pruteanu, Andrei and Stan, Adriana and Rosia, Lorena and Iacobescu, Cristina and others},
booktitle={Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 1)},
year={2021}
}
```
### Contributions
Thanks to [@lorinczb](https://github.com/lorinczb) for adding this dataset. |
open-llm-leaderboard/details_codellama__CodeLlama-34b-Instruct-hf | ---
pretty_name: Evaluation run of codellama/CodeLlama-34b-Instruct-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [codellama/CodeLlama-34b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-34b-Instruct-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_codellama__CodeLlama-34b-Instruct-hf\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-10T07:31:59.292506](https://huggingface.co/datasets/open-llm-leaderboard/details_codellama__CodeLlama-34b-Instruct-hf/blob/main/results_2023-12-10T07-31-59.292506.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5550583840264827,\n\
\ \"acc_stderr\": 0.03405562001199965,\n \"acc_norm\": 0.5588404318554717,\n\
\ \"acc_norm_stderr\": 0.03476259213185152,\n \"mc1\": 0.28518971848225216,\n\
\ \"mc1_stderr\": 0.015805827874454895,\n \"mc2\": 0.44437538633055657,\n\
\ \"mc2_stderr\": 0.014550940721814704\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5093856655290102,\n \"acc_stderr\": 0.014608816322065,\n\
\ \"acc_norm\": 0.5426621160409556,\n \"acc_norm_stderr\": 0.01455810654392406\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5637323242381995,\n\
\ \"acc_stderr\": 0.004949080334816024,\n \"acc_norm\": 0.7691694881497709,\n\
\ \"acc_norm_stderr\": 0.004205030476886528\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464244,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464244\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5921052631578947,\n \"acc_stderr\": 0.03999309712777472,\n\
\ \"acc_norm\": 0.5921052631578947,\n \"acc_norm_stderr\": 0.03999309712777472\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.49056603773584906,\n \"acc_stderr\": 0.0307673947078081,\n\
\ \"acc_norm\": 0.49056603773584906,\n \"acc_norm_stderr\": 0.0307673947078081\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5138888888888888,\n\
\ \"acc_stderr\": 0.04179596617581,\n \"acc_norm\": 0.5138888888888888,\n\
\ \"acc_norm_stderr\": 0.04179596617581\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4508670520231214,\n\
\ \"acc_stderr\": 0.03794012674697028,\n \"acc_norm\": 0.4508670520231214,\n\
\ \"acc_norm_stderr\": 0.03794012674697028\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.047240073523838876,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.047240073523838876\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.49361702127659574,\n \"acc_stderr\": 0.03268335899936336,\n\
\ \"acc_norm\": 0.49361702127659574,\n \"acc_norm_stderr\": 0.03268335899936336\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851102,\n \"\
acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851102\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6096774193548387,\n\
\ \"acc_stderr\": 0.027751256636969576,\n \"acc_norm\": 0.6096774193548387,\n\
\ \"acc_norm_stderr\": 0.027751256636969576\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3891625615763547,\n \"acc_stderr\": 0.034304624161038716,\n\
\ \"acc_norm\": 0.3891625615763547,\n \"acc_norm_stderr\": 0.034304624161038716\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.036639749943912434,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.036639749943912434\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7222222222222222,\n \"acc_stderr\": 0.03191178226713549,\n \"\
acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03191178226713549\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7616580310880829,\n \"acc_stderr\": 0.03074890536390989,\n\
\ \"acc_norm\": 0.7616580310880829,\n \"acc_norm_stderr\": 0.03074890536390989\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5153846153846153,\n \"acc_stderr\": 0.02533900301010651,\n \
\ \"acc_norm\": 0.5153846153846153,\n \"acc_norm_stderr\": 0.02533900301010651\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.02889774874113114,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.02889774874113114\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5168067226890757,\n \"acc_stderr\": 0.03246013680375308,\n \
\ \"acc_norm\": 0.5168067226890757,\n \"acc_norm_stderr\": 0.03246013680375308\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7064220183486238,\n \"acc_stderr\": 0.019525151122639667,\n \"\
acc_norm\": 0.7064220183486238,\n \"acc_norm_stderr\": 0.019525151122639667\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.033247089118091176,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.033247089118091176\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7794117647058824,\n \"acc_stderr\": 0.029102254389674082,\n \"\
acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.029102254389674082\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159256,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159256\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6098654708520179,\n\
\ \"acc_stderr\": 0.03273766725459156,\n \"acc_norm\": 0.6098654708520179,\n\
\ \"acc_norm_stderr\": 0.03273766725459156\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912046,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912046\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.046561471100123514,\n\
\ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.046561471100123514\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n\
\ \"acc_stderr\": 0.025598193686652265,\n \"acc_norm\": 0.811965811965812,\n\
\ \"acc_norm_stderr\": 0.025598193686652265\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.0498887651569859,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7075351213282248,\n\
\ \"acc_stderr\": 0.016267000684598645,\n \"acc_norm\": 0.7075351213282248,\n\
\ \"acc_norm_stderr\": 0.016267000684598645\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5780346820809249,\n \"acc_stderr\": 0.02658923114217426,\n\
\ \"acc_norm\": 0.5780346820809249,\n \"acc_norm_stderr\": 0.02658923114217426\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31620111731843575,\n\
\ \"acc_stderr\": 0.015551673652172554,\n \"acc_norm\": 0.31620111731843575,\n\
\ \"acc_norm_stderr\": 0.015551673652172554\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5915032679738562,\n \"acc_stderr\": 0.028146405993096358,\n\
\ \"acc_norm\": 0.5915032679738562,\n \"acc_norm_stderr\": 0.028146405993096358\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.594855305466238,\n\
\ \"acc_stderr\": 0.027882383791325963,\n \"acc_norm\": 0.594855305466238,\n\
\ \"acc_norm_stderr\": 0.027882383791325963\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6141975308641975,\n \"acc_stderr\": 0.027085401226132143,\n\
\ \"acc_norm\": 0.6141975308641975,\n \"acc_norm_stderr\": 0.027085401226132143\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40425531914893614,\n \"acc_stderr\": 0.029275532159704725,\n \
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.029275532159704725\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3891786179921773,\n\
\ \"acc_stderr\": 0.012452613934287012,\n \"acc_norm\": 0.3891786179921773,\n\
\ \"acc_norm_stderr\": 0.012452613934287012\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.47794117647058826,\n \"acc_stderr\": 0.030343264224213535,\n\
\ \"acc_norm\": 0.47794117647058826,\n \"acc_norm_stderr\": 0.030343264224213535\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.020227834851568375,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.020227834851568375\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6489795918367347,\n\
\ \"acc_stderr\": 0.03055531675557364,\n \"acc_norm\": 0.6489795918367347,\n\
\ \"acc_norm_stderr\": 0.03055531675557364\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.7611940298507462,\n \"acc_stderr\": 0.03014777593540922,\n\
\ \"acc_norm\": 0.7611940298507462,\n \"acc_norm_stderr\": 0.03014777593540922\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.041633319989322626\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.42168674698795183,\n \"acc_stderr\": 0.03844453181770917,\n\
\ \"acc_norm\": 0.42168674698795183,\n \"acc_norm_stderr\": 0.03844453181770917\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7485380116959064,\n\
\ \"acc_stderr\": 0.033275044238468436,\n \"acc_norm\": 0.7485380116959064,\n\
\ \"acc_norm_stderr\": 0.033275044238468436\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.28518971848225216,\n \"mc1_stderr\": 0.015805827874454895,\n\
\ \"mc2\": 0.44437538633055657,\n \"mc2_stderr\": 0.014550940721814704\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.7458563535911602,\n\
\ \"acc_stderr\": 0.012236307219708267\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.379833206974981,\n \"acc_stderr\": 0.013368818096960495\n\
\ }\n}\n```"
repo_url: https://huggingface.co/codellama/CodeLlama-34b-Instruct-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|arc:challenge|25_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|arc:challenge|25_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_05T07_42_59.568400
path:
- '**/details_harness|drop|3_2023-11-05T07-42-59.568400.parquet'
- split: 2023_11_07T04_11_54.628433
path:
- '**/details_harness|drop|3_2023-11-07T04-11-54.628433.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-07T04-11-54.628433.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_05T07_42_59.568400
path:
- '**/details_harness|gsm8k|5_2023-11-05T07-42-59.568400.parquet'
- split: 2023_11_07T04_11_54.628433
path:
- '**/details_harness|gsm8k|5_2023-11-07T04-11-54.628433.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|gsm8k|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hellaswag|10_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hellaswag|10_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T07-31-59.292506.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T23:11:14.511248.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-10T07-31-59.292506.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_05T07_42_59.568400
path:
- '**/details_harness|winogrande|5_2023-11-05T07-42-59.568400.parquet'
- split: 2023_11_07T04_11_54.628433
path:
- '**/details_harness|winogrande|5_2023-11-07T04-11-54.628433.parquet'
- split: 2023_12_10T07_31_59.292506
path:
- '**/details_harness|winogrande|5_2023-12-10T07-31-59.292506.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-10T07-31-59.292506.parquet'
- config_name: results
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- results_2023-08-25T23:11:14.511248.parquet
- split: 2023_11_05T07_42_59.568400
path:
- results_2023-11-05T07-42-59.568400.parquet
- split: 2023_11_07T04_11_54.628433
path:
- results_2023-11-07T04-11-54.628433.parquet
- split: 2023_12_10T07_31_59.292506
path:
- results_2023-12-10T07-31-59.292506.parquet
- split: latest
path:
- results_2023-12-10T07-31-59.292506.parquet
---
# Dataset Card for Evaluation run of codellama/CodeLlama-34b-Instruct-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/codellama/CodeLlama-34b-Instruct-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [codellama/CodeLlama-34b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-34b-Instruct-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_codellama__CodeLlama-34b-Instruct-hf",
"harness_winogrande_5",
split="latest")
```
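The timestamped split names in the configuration above are derived mechanically from each run's timestamp, with dashes and colons replaced by underscores. A small sketch of that convention, inferred from the split and file names in this card (not an official API):

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Derive the split name used in this card from a run timestamp.

    e.g. "2023-12-10T07:31:59.292506" -> "2023_12_10T07_31_59.292506"
    """
    return timestamp.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split("2023-12-10T07:31:59.292506"))
```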
## Latest results
These are the [latest results from run 2023-12-10T07:31:59.292506](https://huggingface.co/datasets/open-llm-leaderboard/details_codellama__CodeLlama-34b-Instruct-hf/blob/main/results_2023-12-10T07-31-59.292506.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5550583840264827,
"acc_stderr": 0.03405562001199965,
"acc_norm": 0.5588404318554717,
"acc_norm_stderr": 0.03476259213185152,
"mc1": 0.28518971848225216,
"mc1_stderr": 0.015805827874454895,
"mc2": 0.44437538633055657,
"mc2_stderr": 0.014550940721814704
},
"harness|arc:challenge|25": {
"acc": 0.5093856655290102,
"acc_stderr": 0.014608816322065,
"acc_norm": 0.5426621160409556,
"acc_norm_stderr": 0.01455810654392406
},
"harness|hellaswag|10": {
"acc": 0.5637323242381995,
"acc_stderr": 0.004949080334816024,
"acc_norm": 0.7691694881497709,
"acc_norm_stderr": 0.004205030476886528
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464244,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464244
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5921052631578947,
"acc_stderr": 0.03999309712777472,
"acc_norm": 0.5921052631578947,
"acc_norm_stderr": 0.03999309712777472
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.49056603773584906,
"acc_stderr": 0.0307673947078081,
"acc_norm": 0.49056603773584906,
"acc_norm_stderr": 0.0307673947078081
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.04179596617581,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.04179596617581
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4508670520231214,
"acc_stderr": 0.03794012674697028,
"acc_norm": 0.4508670520231214,
"acc_norm_stderr": 0.03794012674697028
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.047240073523838876,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.047240073523838876
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.49361702127659574,
"acc_stderr": 0.03268335899936336,
"acc_norm": 0.49361702127659574,
"acc_norm_stderr": 0.03268335899936336
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.025138091388851102,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.025138091388851102
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6096774193548387,
"acc_stderr": 0.027751256636969576,
"acc_norm": 0.6096774193548387,
"acc_norm_stderr": 0.027751256636969576
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3891625615763547,
"acc_stderr": 0.034304624161038716,
"acc_norm": 0.3891625615763547,
"acc_norm_stderr": 0.034304624161038716
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.036639749943912434,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.036639749943912434
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03191178226713549,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03191178226713549
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7616580310880829,
"acc_stderr": 0.03074890536390989,
"acc_norm": 0.7616580310880829,
"acc_norm_stderr": 0.03074890536390989
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5153846153846153,
"acc_stderr": 0.02533900301010651,
"acc_norm": 0.5153846153846153,
"acc_norm_stderr": 0.02533900301010651
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.02889774874113114,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.02889774874113114
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5168067226890757,
"acc_stderr": 0.03246013680375308,
"acc_norm": 0.5168067226890757,
"acc_norm_stderr": 0.03246013680375308
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7064220183486238,
"acc_stderr": 0.019525151122639667,
"acc_norm": 0.7064220183486238,
"acc_norm_stderr": 0.019525151122639667
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.033247089118091176,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.033247089118091176
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.029102254389674082,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.029102254389674082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159256,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159256
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6098654708520179,
"acc_stderr": 0.03273766725459156,
"acc_norm": 0.6098654708520179,
"acc_norm_stderr": 0.03273766725459156
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.040261875275912046,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.040261875275912046
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04557239513497751,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04557239513497751
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.046561471100123514,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.046561471100123514
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.025598193686652265,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.025598193686652265
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.56,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7075351213282248,
"acc_stderr": 0.016267000684598645,
"acc_norm": 0.7075351213282248,
"acc_norm_stderr": 0.016267000684598645
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.02658923114217426,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.02658923114217426
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31620111731843575,
"acc_stderr": 0.015551673652172554,
"acc_norm": 0.31620111731843575,
"acc_norm_stderr": 0.015551673652172554
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5915032679738562,
"acc_stderr": 0.028146405993096358,
"acc_norm": 0.5915032679738562,
"acc_norm_stderr": 0.028146405993096358
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.594855305466238,
"acc_stderr": 0.027882383791325963,
"acc_norm": 0.594855305466238,
"acc_norm_stderr": 0.027882383791325963
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6141975308641975,
"acc_stderr": 0.027085401226132143,
"acc_norm": 0.6141975308641975,
"acc_norm_stderr": 0.027085401226132143
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.029275532159704725,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.029275532159704725
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3891786179921773,
"acc_stderr": 0.012452613934287012,
"acc_norm": 0.3891786179921773,
"acc_norm_stderr": 0.012452613934287012
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47794117647058826,
"acc_stderr": 0.030343264224213535,
"acc_norm": 0.47794117647058826,
"acc_norm_stderr": 0.030343264224213535
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5,
"acc_stderr": 0.020227834851568375,
"acc_norm": 0.5,
"acc_norm_stderr": 0.020227834851568375
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6489795918367347,
"acc_stderr": 0.03055531675557364,
"acc_norm": 0.6489795918367347,
"acc_norm_stderr": 0.03055531675557364
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7611940298507462,
"acc_stderr": 0.03014777593540922,
"acc_norm": 0.7611940298507462,
"acc_norm_stderr": 0.03014777593540922
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7485380116959064,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.7485380116959064,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28518971848225216,
"mc1_stderr": 0.015805827874454895,
"mc2": 0.44437538633055657,
"mc2_stderr": 0.014550940721814704
},
"harness|winogrande|5": {
"acc": 0.7458563535911602,
"acc_stderr": 0.012236307219708267
},
"harness|gsm8k|5": {
"acc": 0.379833206974981,
"acc_stderr": 0.013368818096960495
}
}
```
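The aggregated "all" block at the top of the results is an average over the per-task entries. As a rough, hypothetical illustration of that relationship (using only a small subset of the tasks above, not the leaderboard's actual aggregation code):

```python
# A few per-task accuracies copied from the results above (subset only).
task_acc = {
    "hendrycksTest-abstract_algebra": 0.34,
    "hendrycksTest-computer_security": 0.69,
    "hendrycksTest-marketing": 0.811965811965812,
    "hendrycksTest-us_foreign_policy": 0.78,
}

# Unweighted (macro) average over the selected tasks.
macro_avg = sum(task_acc.values()) / len(task_acc)
print(f"macro average accuracy over {len(task_acc)} tasks: {macro_avg:.4f}")
```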
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
financeart/HON_2 | ---
license: mit
---
|
QianT/autotrain-data-auto_train | ---
task_categories:
- translation
---
# AutoTrain Dataset for project: auto_train
## Dataset Description
This dataset has been automatically processed by AutoTrain for project auto_train.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"source": "\u79fb\u5c45\u9999\u6e2f\u5f8c\uff0c\u60a8\u53ef\u4ee5\u524d\u5f80\u6211\u5011\u7684\u5176\u4e2d\u4e00\u5bb6\u5206\u884c\u7533\u8acb\u4fe1\u7528\u5361\u2014\u2014\u5728\u9019\u88e1\u627e\u5230\u60a8\u6700\u65b9\u4fbf\u7684\u5730\u9ede\u3002",
"target": "After you move to Hong Kong you can apply for a Credit Card by visiting one of our branches \u2013 find your most convenient location here."
},
{
"source": "\u79fb\u5c45\u9999\u6e2f\u5f8c\uff0c\u60a8\u53ef\u4ee5\u524d\u5f80\u6211\u5011\u7684\u5176\u4e2d\u4e00\u5bb6\u5206\u884c\u7533\u8acb\u5132\u84c4/\u652f\u7968\u8cec\u6236\u2014\u2014\u5728\u9019\u88e1\u627e\u5230\u60a8\u6700\u65b9\u4fbf\u7684\u5730\u9ede\u3002\u5982\u679c\u60a8\u9858\u610f\uff0c\u6211\u5011\u53ef\u4ee5\u5728\u60a8\u62b5\u9054\u5f8c\u70ba\u60a8\u5b89\u6392\u5728\u60a8\u9078\u64c7\u7684\u5206\u884c\u7684\u9810\u7d04\u8a0e\u8ad6\u60a8\u7684\u9280\u884c\u548c\u8ca1\u5bcc\u7ba1\u7406\u9700\u6c42\u3002\u8981\u5b89\u6392\u7d04\u6703\uff0c\u8acb\u806f\u7e6b\u60a8\u7576\u5730\u7684\u82b1\u65d7\u9280\u884c\u4ee3\u8868\u3002",
"target": "After you move to Hong Kong you can apply for a Savings / Checking Account by visiting one of our branches \u2013 find your most convenient location here.If you wish so, we can schedule an appointment for you in a Branch of your choice upon your arrival to discuss your banking and wealth management needs. To schedule an appointment, contact your local Citibank representative."
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"source": "Value(dtype='string', id=None)",
"target": "Value(dtype='string', id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 332 |
| valid | 83 |
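The train/valid sizes above (332/83) correspond to roughly an 80/20 split of the 415 source pairs. A hypothetical sketch of how such a split could be reproduced (this is not AutoTrain's actual splitting code, just an illustration of the ratio):

```python
import random

# 415 placeholder translation pairs standing in for the real data.
pairs = [{"source": f"src {i}", "target": f"tgt {i}"} for i in range(415)]

rng = random.Random(0)  # fixed seed for reproducibility
rng.shuffle(pairs)

# Integer arithmetic avoids float rounding: 415 * 4 // 5 == 332.
n_train = len(pairs) * 4 // 5
train, valid = pairs[:n_train], pairs[n_train:]
print(len(train), len(valid))  # 332 83
```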
|
leduckhai/VietMed | ---
viewer: false
---
# VietMed: A Dataset and Benchmark for Automatic Speech Recognition of Vietnamese in the Medical Domain
## Description:
We introduced a Vietnamese speech recognition dataset in the medical domain comprising 16h of labeled medical speech, 1000h of unlabeled medical speech and 1200h of unlabeled general-domain speech.
To the best of our knowledge, VietMed is by far **the world’s largest public medical speech recognition dataset** in 7 aspects:
total duration, number of speakers, diseases, recording conditions, speaker roles, unique medical terms and accents.
VietMed is also by far the largest public Vietnamese speech dataset in terms of total duration.
Additionally, we are the first to present a medical ASR dataset covering all ICD-10 disease groups and all accents within a country.
Please cite this paper: https://arxiv.org/abs/2404.05659
```bibtex
@inproceedings{VietMed_dataset,
  title={VietMed: A Dataset and Benchmark for Automatic Speech Recognition of Vietnamese in the Medical Domain},
  author={Khai Le-Duc},
  year={2024},
  booktitle={Proceedings of the Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)},
}
```
## Contact:
```
Le Duc Khai
University of Toronto, Canada
Email: duckhai.le@mail.utoronto.ca
GitHub: https://github.com/leduckhai
``` |
Skynet9513/LEON4 | ---
license: openrail
---
|
VityaVitalich/LIMA | ---
dataset_info:
features:
- name: dataset
dtype: string
- name: id
dtype: string
- name: messages
list:
- name: role
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 3868113.3560649264
num_examples: 1018
download_size: 1649254
dataset_size: 3868113.3560649264
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
qazisaad/llama_2_optimized_product_titles-esci | ---
dataset_info:
features:
- name: index
dtype: int64
- name: product_title
dtype: string
- name: average_score
dtype: float64
- name: total_score
dtype: float64
- name: text
dtype: string
splits:
- name: train
num_bytes: 6296294
num_examples: 2199
download_size: 987749
dataset_size: 6296294
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama_2_optimized_product_titles-esci"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
openaccess-ai-collective/d8d39edd06f6930363461e2a189d8b37 | Invalid username or password. |
sxysxy/wat2c_from_bigdata_1w | ---
license: apache-2.0
---
|
mnoukhov/openai_summarize_generated_20k_relabel_1b_margin | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: pred_chosen
dtype: float32
- name: pred_rejected
dtype: float32
splits:
- name: train
num_bytes: 36142323
num_examples: 20000
download_size: 22113730
dataset_size: 36142323
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
xOrfe/Match3 | ---
license: cc-by-nc-nd-3.0
---
|
qq2905627706/yuju | ---
license: openrail
---
|
renumics/esc50 | ---
dataset_info:
features:
- name: src_file
dtype: string
- name: fold
dtype: int64
- name: label
dtype:
class_label:
names:
'0': dog
'1': rooster
'2': pig
'3': cow
'4': frog
'5': cat
'6': hen
'7': insects
'8': sheep
'9': crow
'10': rain
'11': sea_waves
'12': crackling_fire
'13': crickets
'14': chirping_birds
'15': water_drops
'16': wind
'17': pouring_water
'18': toilet_flush
'19': thunderstorm
'20': crying_baby
'21': sneezing
'22': clapping
'23': breathing
'24': coughing
'25': footsteps
'26': laughing
'27': brushing_teeth
'28': snoring
'29': drinking_sipping
'30': door_wood_knock
'31': mouse_click
'32': keyboard_typing
'33': door_wood_creaks
'34': can_opening
'35': washing_machine
'36': vacuum_cleaner
'37': clock_alarm
'38': clock_tick
'39': glass_breaking
'40': helicopter
'41': chainsaw
'42': siren
'43': car_horn
'44': engine
'45': train
'46': church_bells
'47': airplane
'48': fireworks
'49': hand_saw
- name: esc10
dtype: bool
- name: take
dtype: string
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 882179256
num_examples: 2000
download_size: 773038488
dataset_size: 882179256
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: cc-by-nc-2.0
task_categories:
- audio-classification
size_categories:
- 1K<n<10K
---
# Dataset Card for "esc50"
This is a mirror for the ESC-50 dataset. Original sources:
https://github.com/karolpiczak/ESC-50
K. J. Piczak. ESC: Dataset for Environmental Sound Classification. Proceedings of the 23rd Annual ACM Conference on Multimedia, Brisbane, Australia, 2015.
[DOI: http://dx.doi.org/10.1145/2733373.2806390]
The dataset is available under the terms of the Creative Commons Attribution Non-Commercial license.
## Exploring the dataset
You can visualize the dataset using Renumics Spotlight:
```python
import datasets
from renumics import spotlight
ds = datasets.load_dataset('renumics/esc50', split='train')
spotlight.show(ds)
```
## Exploring the enriched dataset
To understand the dataset more fully, you can leverage model results such as embeddings or predictions.
Here is an example of how to use zero-shot classification with MS CLAP for this purpose:
```python
ds_results = datasets.load_dataset("renumics/esc50-clap2023-results",split='train')
ds = datasets.concatenate_datasets([ds, ds_results], axis=1)
spotlight.show(ds, dtype={'text_embedding': spotlight.Embedding, 'audio_embedding': spotlight.Embedding})
```
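The `axis=1` concatenation above merges the i-th record of the results with the i-th record of the audio data, so both tables must have the same length and row order. A minimal stdlib sketch of the same idea, using placeholder rows instead of the real Hub datasets:

```python
# Placeholder rows standing in for ds and ds_results
# (same length, same row order -- a requirement of axis=1 concatenation).
audio_rows = [
    {"src_file": "1-100032-A-0.wav", "label": 0},
    {"src_file": "1-100038-A-14.wav", "label": 14},
]
result_rows = [{"prediction": 0}, {"prediction": 14}]

# Column-wise merge: combine the i-th records of each table into one row.
merged = [{**a, **r} for a, r in zip(audio_rows, result_rows)]
assert merged[0]["prediction"] == merged[0]["label"] == 0
```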

|
stoddur/medication_chat | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 371157528.0
num_examples: 240387
download_size: 14253912
dataset_size: 371157528.0
---
# Dataset Card for "medication_chat"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
silk-road/ChatHaruhi_NovelWriting | ---
license: cc-by-4.0
---
|
BOP-Berlin-University-Alliance/dc_terms_raw_data | ---
license: gpl-3.0
task_categories:
- text-classification
language:
- en
pretty_name: meta data
size_categories:
- n<1K
---
The dataset consists of descriptions of and comments about the concepts in the Dublin Core terms ontology. |
JINIAC/ja_law_20240330_prefilter | ---
license: cc-by-4.0
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1501734982
num_examples: 449041
download_size: 422291061
dataset_size: 1501734982
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
vlsp-2023-vllm/comprehension | ---
dataset_info:
features:
- name: question
dtype: string
- name: id
dtype: string
- name: choices
struct:
- name: label
sequence: string
- name: text
sequence: string
- name: answerKey
dtype: string
splits:
- name: test
num_bytes: 2742115
num_examples: 900
download_size: 1261593
dataset_size: 2742115
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "comprehension"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
foundation-models/dummy-detecton | ---
license: lgpl
---
|
Montazer/kafi | ---
dataset_info:
features:
- name: volume
dtype: string
- name: chapter
dtype: string
- name: section
dtype: string
- name: subsection
dtype: string
- name: number
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 18817230
num_examples: 18335
download_size: 6180019
dataset_size: 18817230
---
# Usul al-Kafi Dataset
## Description
The Usul al-Kafi dataset is a digital compilation of the narrations from the renowned Shia Islamic book "Usul al-Kafi." It contains a comprehensive collection of hadiths attributed to the Prophet Muhammad and the Shia Imams, covering various aspects of Islamic teachings, including theology, ethics, jurisprudence, and social issues.
## Usage
This dataset serves as a valuable resource for scholars, researchers, and students interested in studying and understanding Shia Islamic traditions and teachings. It can be utilized for research, academic studies, religious studies, and comparative analysis of hadith literature.
## Content
The dataset is structured in a tabular format with the following columns:
| Column Name | Description |
|-------------|-----------------------------------|
| volume | The volume (mujallad) in which the hadith is printed |
| chapter | The chapter ('kitab') of the book containing the hadith |
| section | The section ('bab') within the chapter |
| subsection | The nested subsection; a 'bab' may itself contain further 'bab's |
| number | The hadith's number within its 'bab', in order of appearance |
| text | The body of the hadith |
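A minimal sketch of how these columns might be used once the rows are loaded, e.g. counting hadiths per chapter ('kitab'). The records below are illustrative placeholders, not real entries from the dataset:

```python
from collections import Counter

# Illustrative rows mirroring the column layout above (placeholder values).
rows = [
    {"volume": "vol-1", "chapter": "kitab-1", "section": "bab-1", "subsection": "", "number": "1", "text": "..."},
    {"volume": "vol-1", "chapter": "kitab-1", "section": "bab-2", "subsection": "", "number": "2", "text": "..."},
    {"volume": "vol-1", "chapter": "kitab-2", "section": "bab-1", "subsection": "", "number": "1", "text": "..."},
]

# Tally hadiths per chapter ('kitab').
per_chapter = Counter(row["chapter"] for row in rows)
assert per_chapter["kitab-1"] == 2 and per_chapter["kitab-2"] == 1
```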
## Examples
The following table shows the head of the dataset:
| volume | chapter | section | subsection | number | text |
|:---------|:-----------|:----------------------|:-------------|---------:|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| المجلد 1 | کِتَابُ الْحُجَّهِ | بَابٌ فِی تَسْمِیَهِ مَنْ رَآهُ ع | | 7 | 8- عَلِیٌّ عَنْ أَبِی عَلِیٍّ أَحْمَدَ بْنِ إِبْرَاهِیمَ بْنِ إِدْرِیسَ عَنْ أَبِیهِ أَنَّهُ قَالَ: رَأَیْتُهُ ع بَعْدَ مُضِیِّ أَبِی مُحَمَّدٍ حِینَ أَیْفَعَ وَ قَبَّلْتُ یَدَیْهِ وَ رَأْسَهُ. |
| المجلد 1 | کِتَابُ الْحُجَّهِ | بَابٌ فِی تَسْمِیَهِ مَنْ رَآهُ ع | | 8 | 9- عَلِیٌّ عَنْ أَبِی عَبْدِ اللَّهِ بْنِ صَالِحٍ وَ أَحْمَدَ بْنِ النَّضْرِ عَنِ الْقَنْبَرِیِّ رَجُلٌ مِنْ وُلْدِ قَنْبَرٍ الْکَبِیرِ مَوْلَی أَبِی الْحَسَنِ الرِّضَا ع قَالَ: جَرَی حَدِیثُ جَعْفَرِ بْنِ عَلِیٍّ فَذَمَّهُ فَقُلْتُ لَهُ فَلَیْسَ غَیْرُهُ فَهَلْ رَأَیْتَهُ فَقَالَ لَمْ أَرَهُ وَ لَکِنْ رَآهُ غَیْرِی قُلْتُ وَ مَنْ رَآهُ قَالَ قَدْ رَآهُ جَعْفَرٌ مَرَّتَیْنِ وَ لَهُ حَدِیثٌ. |
| المجلد 1 | کِتَابُ الْحُجَّهِ | بَابٌ فِی تَسْمِیَهِ مَنْ رَآهُ ع | | 9 | 10- عَلِیُّ بْنُ مُحَمَّدٍ عَنْ أَبِی مُحَمَّدٍ الْوَجْنَانِیِّ أَنَّهُ أَخْبَرَنِی عَمَّنْ رَآهُ أَنَّهُ خَرَجَ مِنَ الدَّارِ قَبْلَ الْحَادِثِ بِعَشَرَهِ أَیَّامٍ وَ هُوَ یَقُولُ اللَّهُمَّ إِنَّکَ تَعْلَمُ أَنَّهَا مِنْ أَحَبِّ الْبِقَاعِ لَوْ لَا الطَّرْدُ:" أَوْ کَلَامٌ هَذَا نَحْوُهُ". |
| المجلد 1 | کِتَابُ الْحُجَّهِ | بَابٌ فِی تَسْمِیَهِ مَنْ رَآهُ ع | | 10 | 11- عَلِیُّ بْنُ مُحَمَّدٍ عَنْ عَلِیِّ بْنِ قَیْسٍ عَنْ بَعْضِ جَلَاوِزَهِ السَّوَادِ قَالَ: شَاهَدْتُ سِیمَاءَ (3) آنِفاً بِسُرَّ مَنْ رَأَی وَ قَدْ کَسَرَ بَابَ الدَّارِ فَخَرَجَ عَلَیْهِ وَ بِیَدِهِ طَبَرْزِینٌ فَقَالَ لَهُ-مَا تَصْنَعُ فِی دَارِی فَقَالَ سِیمَاءُ إِنَّ جَعْفَراً زَعَمَ أَنَّ أَبَاکَ مَضَی وَ لَا وَلَدَ لَهُ فَإِنْ کَانَتْ دَارَکَ فَقَدِ انْصَرَفْتُ عَنْکَ فَخَرَجَ عَنِ الدَّارِ قَالَ- عَلِیُّ بْنُ قَیْسٍ فَخَرَجَ عَلَیْنَا خَادِمٌ مِنْ خَدَمِ الدَّارِ فَسَأَلْتُهُ عَنْ هَذَا الْخَبَرِ فَقَالَ لِی مَنْ حَدَّثَکَ بِهَذَا فَقُلْتُ لَهُ حَدَّثَنِی بَعْضُ جَلَاوِزَهِ السَّوَادِ فَقَالَ لِی لَا یَکَادُ یَخْفَی عَلَی النَّاسِ شَیْ ءٌ. |
| المجلد 1 | کِتَابُ الْحُجَّهِ | بَابٌ فِی تَسْمِیَهِ مَنْ رَآهُ ع | | 11 | 12- عَلِیُّ بْنُ مُحَمَّدٍ عَنْ جَعْفَرِ بْنِ مُحَمَّدٍ الْکُوفِیِّ عَنْ جَعْفَرِ بْنِ مُحَمَّدٍ الْمَکْفُوفِ عَنْ عَمْرٍو الْأَهْوَازِیِّ قَالَ: أَرَانِیهِ أَبُو مُحَمَّدٍ ع وَ قَالَ هَذَا صَاحِبُکُمْ. (1) |
## Source
The Usul al-Kafi dataset is the result of the effort by [Ghaemiyeh Computer Research Institute of Isfahan](https://www.ghbook.ir/index.php?lang=en). We are grateful for their contribution in making this dataset available.
## License
The Usul al-Kafi dataset is released under the Creative Commons Attribution-NonCommercial-ShareAlike (CC BY-NC-SA) license. This license allows users to use, adapt, and distribute the dataset for non-commercial purposes, provided they attribute the dataset to its original source, share any derivative works under the same license, and refrain from generating income from the dataset's use. |
open-llm-leaderboard/details_google__gemma-2b-it | ---
pretty_name: Evaluation run of google/gemma-2b-it
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [google/gemma-2b-it](https://huggingface.co/google/gemma-2b-it) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_google__gemma-2b-it\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-22T15:21:47.760131](https://huggingface.co/datasets/open-llm-leaderboard/details_google__gemma-2b-it/blob/main/results_2024-02-22T15-21-47.760131.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.37718990524507395,\n\
\ \"acc_stderr\": 0.03382258665978276,\n \"acc_norm\": 0.3817941191023956,\n\
\ \"acc_norm_stderr\": 0.03463067200858019,\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.45823771048471756,\n\
\ \"mc2_stderr\": 0.01592882772091717\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4044368600682594,\n \"acc_stderr\": 0.014342036483436172,\n\
\ \"acc_norm\": 0.439419795221843,\n \"acc_norm_stderr\": 0.01450374782358013\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.48078072097191793,\n\
\ \"acc_stderr\": 0.004986093791041649,\n \"acc_norm\": 0.6269667396932882,\n\
\ \"acc_norm_stderr\": 0.004826224784850447\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3851851851851852,\n\
\ \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.3851851851851852,\n\
\ \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3355263157894737,\n \"acc_stderr\": 0.03842498559395269,\n\
\ \"acc_norm\": 0.3355263157894737,\n \"acc_norm_stderr\": 0.03842498559395269\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n\
\ \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.48,\n \
\ \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4226415094339623,\n \"acc_stderr\": 0.030402331445769537,\n\
\ \"acc_norm\": 0.4226415094339623,\n \"acc_norm_stderr\": 0.030402331445769537\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3402777777777778,\n\
\ \"acc_stderr\": 0.03962135573486219,\n \"acc_norm\": 0.3402777777777778,\n\
\ \"acc_norm_stderr\": 0.03962135573486219\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3583815028901734,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.3583815028901734,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.038739587141493524,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.038739587141493524\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.35319148936170214,\n \"acc_stderr\": 0.031245325202761923,\n\
\ \"acc_norm\": 0.35319148936170214,\n \"acc_norm_stderr\": 0.031245325202761923\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.04154659671707546,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.04154659671707546\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24867724867724866,\n \"acc_stderr\": 0.022261817692400168,\n \"\
acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.022261817692400168\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.03893259610604674,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.03893259610604674\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n\
\ \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n\
\ \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.29064039408866993,\n \"acc_stderr\": 0.0319474007226554,\n\
\ \"acc_norm\": 0.29064039408866993,\n \"acc_norm_stderr\": 0.0319474007226554\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.46060606060606063,\n \"acc_stderr\": 0.03892207016552012,\n\
\ \"acc_norm\": 0.46060606060606063,\n \"acc_norm_stderr\": 0.03892207016552012\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4595959595959596,\n \"acc_stderr\": 0.035507024651313425,\n \"\
acc_norm\": 0.4595959595959596,\n \"acc_norm_stderr\": 0.035507024651313425\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.47668393782383417,\n \"acc_stderr\": 0.03604513672442207,\n\
\ \"acc_norm\": 0.47668393782383417,\n \"acc_norm_stderr\": 0.03604513672442207\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3282051282051282,\n \"acc_stderr\": 0.023807633198657262,\n\
\ \"acc_norm\": 0.3282051282051282,\n \"acc_norm_stderr\": 0.023807633198657262\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2,\n \"acc_stderr\": 0.024388430433987664,\n \"acc_norm\"\
: 0.2,\n \"acc_norm_stderr\": 0.024388430433987664\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.3403361344537815,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.3403361344537815,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5100917431192661,\n \"acc_stderr\": 0.021432956203453316,\n \"\
acc_norm\": 0.5100917431192661,\n \"acc_norm_stderr\": 0.021432956203453316\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2037037037037037,\n \"acc_stderr\": 0.027467401804057986,\n \"\
acc_norm\": 0.2037037037037037,\n \"acc_norm_stderr\": 0.027467401804057986\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4264705882352941,\n \"acc_stderr\": 0.03471157907953426,\n \"\
acc_norm\": 0.4264705882352941,\n \"acc_norm_stderr\": 0.03471157907953426\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5189873417721519,\n \"acc_stderr\": 0.03252375148090448,\n \
\ \"acc_norm\": 0.5189873417721519,\n \"acc_norm_stderr\": 0.03252375148090448\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3901345291479821,\n\
\ \"acc_stderr\": 0.03273766725459157,\n \"acc_norm\": 0.3901345291479821,\n\
\ \"acc_norm_stderr\": 0.03273766725459157\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.42748091603053434,\n \"acc_stderr\": 0.04338920305792401,\n\
\ \"acc_norm\": 0.42748091603053434,\n \"acc_norm_stderr\": 0.04338920305792401\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5206611570247934,\n \"acc_stderr\": 0.04560456086387235,\n \"\
acc_norm\": 0.5206611570247934,\n \"acc_norm_stderr\": 0.04560456086387235\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.46296296296296297,\n\
\ \"acc_stderr\": 0.04820403072760627,\n \"acc_norm\": 0.46296296296296297,\n\
\ \"acc_norm_stderr\": 0.04820403072760627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3619631901840491,\n \"acc_stderr\": 0.037757007291414416,\n\
\ \"acc_norm\": 0.3619631901840491,\n \"acc_norm_stderr\": 0.037757007291414416\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.044642857142857116,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.044642857142857116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.44660194174757284,\n \"acc_stderr\": 0.04922424153458933,\n\
\ \"acc_norm\": 0.44660194174757284,\n \"acc_norm_stderr\": 0.04922424153458933\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.594017094017094,\n\
\ \"acc_stderr\": 0.03217180182641086,\n \"acc_norm\": 0.594017094017094,\n\
\ \"acc_norm_stderr\": 0.03217180182641086\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4623243933588761,\n\
\ \"acc_stderr\": 0.017829131764287187,\n \"acc_norm\": 0.4623243933588761,\n\
\ \"acc_norm_stderr\": 0.017829131764287187\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.407514450867052,\n \"acc_stderr\": 0.0264545781469315,\n\
\ \"acc_norm\": 0.407514450867052,\n \"acc_norm_stderr\": 0.0264545781469315\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25251396648044694,\n\
\ \"acc_stderr\": 0.014530330201468636,\n \"acc_norm\": 0.25251396648044694,\n\
\ \"acc_norm_stderr\": 0.014530330201468636\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.02849199358617157,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.02849199358617157\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.40192926045016075,\n\
\ \"acc_stderr\": 0.027846476005930473,\n \"acc_norm\": 0.40192926045016075,\n\
\ \"acc_norm_stderr\": 0.027846476005930473\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.027339546640662727,\n\
\ \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.027339546640662727\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3049645390070922,\n \"acc_stderr\": 0.027464708442022135,\n \
\ \"acc_norm\": 0.3049645390070922,\n \"acc_norm_stderr\": 0.027464708442022135\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.31877444589308995,\n\
\ \"acc_stderr\": 0.0119018956357861,\n \"acc_norm\": 0.31877444589308995,\n\
\ \"acc_norm_stderr\": 0.0119018956357861\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.024398192986654924,\n\
\ \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.024398192986654924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.38235294117647056,\n \"acc_stderr\": 0.019659922493623336,\n \
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.019659922493623336\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.41818181818181815,\n\
\ \"acc_stderr\": 0.0472457740573157,\n \"acc_norm\": 0.41818181818181815,\n\
\ \"acc_norm_stderr\": 0.0472457740573157\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.47346938775510206,\n \"acc_stderr\": 0.03196412734523272,\n\
\ \"acc_norm\": 0.47346938775510206,\n \"acc_norm_stderr\": 0.03196412734523272\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.43781094527363185,\n\
\ \"acc_stderr\": 0.035080801121998406,\n \"acc_norm\": 0.43781094527363185,\n\
\ \"acc_norm_stderr\": 0.035080801121998406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079023,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079023\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.038057975055904594,\n\
\ \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.038057975055904594\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.45823771048471756,\n\
\ \"mc2_stderr\": 0.01592882772091717\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6093133385951065,\n \"acc_stderr\": 0.01371253603655665\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05458680818802123,\n \
\ \"acc_stderr\": 0.006257444037912521\n }\n}\n```"
repo_url: https://huggingface.co/google/gemma-2b-it
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|arc:challenge|25_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|gsm8k|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hellaswag|10_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T15-21-47.760131.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-22T15-21-47.760131.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- '**/details_harness|winogrande|5_2024-02-22T15-21-47.760131.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-22T15-21-47.760131.parquet'
- config_name: results
data_files:
- split: 2024_02_22T15_21_47.760131
path:
- results_2024-02-22T15-21-47.760131.parquet
- split: latest
path:
- results_2024-02-22T15-21-47.760131.parquet
---
# Dataset Card for Evaluation run of google/gemma-2b-it
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [google/gemma-2b-it](https://huggingface.co/google/gemma-2b-it) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_google__gemma-2b-it",
"harness_winogrande_5",
	split="latest")
```
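The aggregate metrics under `"all"` in the latest results below are, to our understanding, macro-averages over the per-task scores. As a minimal, hypothetical sketch (task names and values copied from the results further down; the subset of three tasks is chosen for illustration only), such an average can be recomputed as:

```python
# Hypothetical subset of per-task accuracies from the results below.
per_task_acc = {
    "harness|hendrycksTest-abstract_algebra|5": 0.28,
    "harness|hendrycksTest-college_chemistry|5": 0.26,
    "harness|hendrycksTest-computer_security|5": 0.45,
}

# Macro-average: unweighted mean over tasks (an assumption about how
# the "all" entry is aggregated, not a documented formula).
macro_acc = sum(per_task_acc.values()) / len(per_task_acc)
print(round(macro_acc, 4))
```

Note this averages only the three example tasks; the leaderboard's `"all"` entry covers every evaluated task.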
## Latest results
These are the [latest results from run 2024-02-22T15:21:47.760131](https://huggingface.co/datasets/open-llm-leaderboard/details_google__gemma-2b-it/blob/main/results_2024-02-22T15-21-47.760131.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.37718990524507395,
"acc_stderr": 0.03382258665978276,
"acc_norm": 0.3817941191023956,
"acc_norm_stderr": 0.03463067200858019,
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.45823771048471756,
"mc2_stderr": 0.01592882772091717
},
"harness|arc:challenge|25": {
"acc": 0.4044368600682594,
"acc_stderr": 0.014342036483436172,
"acc_norm": 0.439419795221843,
"acc_norm_stderr": 0.01450374782358013
},
"harness|hellaswag|10": {
"acc": 0.48078072097191793,
"acc_stderr": 0.004986093791041649,
"acc_norm": 0.6269667396932882,
"acc_norm_stderr": 0.004826224784850447
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3355263157894737,
"acc_stderr": 0.03842498559395269,
"acc_norm": 0.3355263157894737,
"acc_norm_stderr": 0.03842498559395269
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.48,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4226415094339623,
"acc_stderr": 0.030402331445769537,
"acc_norm": 0.4226415094339623,
"acc_norm_stderr": 0.030402331445769537
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3402777777777778,
"acc_stderr": 0.03962135573486219,
"acc_norm": 0.3402777777777778,
"acc_norm_stderr": 0.03962135573486219
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3583815028901734,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.3583815028901734,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.038739587141493524,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.038739587141493524
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.35319148936170214,
"acc_stderr": 0.031245325202761923,
"acc_norm": 0.35319148936170214,
"acc_norm_stderr": 0.031245325202761923
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.04154659671707546,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.04154659671707546
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.022261817692400168,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.022261817692400168
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604674,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604674
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.29064039408866993,
"acc_stderr": 0.0319474007226554,
"acc_norm": 0.29064039408866993,
"acc_norm_stderr": 0.0319474007226554
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.46060606060606063,
"acc_stderr": 0.03892207016552012,
"acc_norm": 0.46060606060606063,
"acc_norm_stderr": 0.03892207016552012
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4595959595959596,
"acc_stderr": 0.035507024651313425,
"acc_norm": 0.4595959595959596,
"acc_norm_stderr": 0.035507024651313425
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.47668393782383417,
"acc_stderr": 0.03604513672442207,
"acc_norm": 0.47668393782383417,
"acc_norm_stderr": 0.03604513672442207
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3282051282051282,
"acc_stderr": 0.023807633198657262,
"acc_norm": 0.3282051282051282,
"acc_norm_stderr": 0.023807633198657262
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2,
"acc_stderr": 0.024388430433987664,
"acc_norm": 0.2,
"acc_norm_stderr": 0.024388430433987664
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3403361344537815,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.3403361344537815,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5100917431192661,
"acc_stderr": 0.021432956203453316,
"acc_norm": 0.5100917431192661,
"acc_norm_stderr": 0.021432956203453316
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.027467401804057986,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.027467401804057986
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4264705882352941,
"acc_stderr": 0.03471157907953426,
"acc_norm": 0.4264705882352941,
"acc_norm_stderr": 0.03471157907953426
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5189873417721519,
"acc_stderr": 0.03252375148090448,
"acc_norm": 0.5189873417721519,
"acc_norm_stderr": 0.03252375148090448
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3901345291479821,
"acc_stderr": 0.03273766725459157,
"acc_norm": 0.3901345291479821,
"acc_norm_stderr": 0.03273766725459157
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.42748091603053434,
"acc_stderr": 0.04338920305792401,
"acc_norm": 0.42748091603053434,
"acc_norm_stderr": 0.04338920305792401
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5206611570247934,
"acc_stderr": 0.04560456086387235,
"acc_norm": 0.5206611570247934,
"acc_norm_stderr": 0.04560456086387235
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.04820403072760627,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.04820403072760627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3619631901840491,
"acc_stderr": 0.037757007291414416,
"acc_norm": 0.3619631901840491,
"acc_norm_stderr": 0.037757007291414416
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.044642857142857116,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.044642857142857116
},
"harness|hendrycksTest-management|5": {
"acc": 0.44660194174757284,
"acc_stderr": 0.04922424153458933,
"acc_norm": 0.44660194174757284,
"acc_norm_stderr": 0.04922424153458933
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.594017094017094,
"acc_stderr": 0.03217180182641086,
"acc_norm": 0.594017094017094,
"acc_norm_stderr": 0.03217180182641086
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4623243933588761,
"acc_stderr": 0.017829131764287187,
"acc_norm": 0.4623243933588761,
"acc_norm_stderr": 0.017829131764287187
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.407514450867052,
"acc_stderr": 0.0264545781469315,
"acc_norm": 0.407514450867052,
"acc_norm_stderr": 0.0264545781469315
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25251396648044694,
"acc_stderr": 0.014530330201468636,
"acc_norm": 0.25251396648044694,
"acc_norm_stderr": 0.014530330201468636
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.02849199358617157,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.02849199358617157
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.40192926045016075,
"acc_stderr": 0.027846476005930473,
"acc_norm": 0.40192926045016075,
"acc_norm_stderr": 0.027846476005930473
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.027339546640662727,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.027339546640662727
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3049645390070922,
"acc_stderr": 0.027464708442022135,
"acc_norm": 0.3049645390070922,
"acc_norm_stderr": 0.027464708442022135
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.31877444589308995,
"acc_stderr": 0.0119018956357861,
"acc_norm": 0.31877444589308995,
"acc_norm_stderr": 0.0119018956357861
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20220588235294118,
"acc_stderr": 0.024398192986654924,
"acc_norm": 0.20220588235294118,
"acc_norm_stderr": 0.024398192986654924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.019659922493623336,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.019659922493623336
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.41818181818181815,
"acc_stderr": 0.0472457740573157,
"acc_norm": 0.41818181818181815,
"acc_norm_stderr": 0.0472457740573157
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.47346938775510206,
"acc_stderr": 0.03196412734523272,
"acc_norm": 0.47346938775510206,
"acc_norm_stderr": 0.03196412734523272
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.43781094527363185,
"acc_stderr": 0.035080801121998406,
"acc_norm": 0.43781094527363185,
"acc_norm_stderr": 0.035080801121998406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079023,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079023
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.038057975055904594,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.038057975055904594
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.45823771048471756,
"mc2_stderr": 0.01592882772091717
},
"harness|winogrande|5": {
"acc": 0.6093133385951065,
"acc_stderr": 0.01371253603655665
},
"harness|gsm8k|5": {
"acc": 0.05458680818802123,
"acc_stderr": 0.006257444037912521
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
yaya-sy/rp_test_tiny_llama | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: corpus
dtype: string
splits:
- name: train
num_bytes: 33669688.235242225
num_examples: 16324
download_size: 15127813
dataset_size: 33669688.235242225
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nguyenvy/cleaned_nhanes_1988_2018 | ---
license: cc-by-4.0
---
Description: The National Health and Nutrition Examination Survey (NHANES) provides data with considerable potential for studying the health and environmental exposure of the non-institutionalized US population. However, as NHANES data are plagued with multiple inconsistencies, these data require processing before new insights can be derived through large-scale analyses. Thus, we developed a set of curated and unified datasets by merging 614 separate files and harmonizing unrestricted data across NHANES III (1988-1994) and Continuous NHANES (1999-2018), totaling 135,310 participants and 5,078 variables. The variables convey
1. demographics (281 variables),
2. dietary consumption (324 variables),
3. physiological functions (1,027 variables),
4. occupation (61 variables),
5. questionnaires (1,444 variables, e.g., physical activity, medical conditions, diabetes, reproductive health, blood pressure and cholesterol, early childhood),
6. medications (29 variables),
7. mortality information linked from the National Death Index (15 variables),
8. survey weights (857 variables),
9. environmental exposure biomarker measurements (598 variables), and
10. chemical comments indicating which measurements are below or above the lower limit of detection (505 variables).
CSV Data Record: The curated NHANES datasets and the data dictionaries include 23 .csv files and 1 Excel file.
- The curated NHANES datasets involve 20 .csv-formatted files, two for each module, with one as the uncleaned version and the other as the cleaned version. The modules are labeled as follows: 1) mortality, 2) dietary, 3) demographics, 4) response, 5) medications, 6) questionnaire, 7) chemicals, 8) occupation, 9) weights, and 10) comments.
- "dictionary\_nhanes.csv" is a dictionary that lists the variable name, description, module, category, units, CAS Number, comment use, chemical family, chemical family shortened, number of measurements, and cycles available for all 5,078 variables in NHANES.
- "dictionary\_harmonized\_categories.csv" contains the harmonized categories for the categorical variables.
- “dictionary\_drug\_codes.csv” contains the dictionary for descriptors on the drugs codes.
- “nhanes\_inconsistencies\_documentation.xlsx” is an excel file that contains the cleaning documentation, which records all the inconsistencies for all affected variables to help curate each of the NHANES modules.
R Data Record: For researchers who want to conduct their analysis in the R programming language, only the cleaned NHANES modules and the data dictionaries can be downloaded as a .zip file, which includes an .RData file and an .R file.
- “w - nhanes_1988\_2018.RData” contains all the aforementioned datasets as R data objects. We make available all R scripts for the customized functions that were written to curate the data.
- “m - nhanes\_1988\_2018.R” shows how we used the customized functions (i.e. our pipeline) to curate the original NHANES data.
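Once loaded, the cleaned module CSVs can be joined on the shared participant identifier (SEQN in NHANES). A minimal Python sketch of that join, using small in-memory stand-ins for two modules (in practice these would come from `pd.read_csv` on the module files above; the variable names `RIDAGEYR` and `LBXBPB` are illustrative examples of demographic and biomarker columns):

```python
import pandas as pd

# Stand-in rows for two cleaned NHANES modules; in practice these would be
# loaded with pd.read_csv(...) from the module files in this dataset.
demographics = pd.DataFrame({
    "SEQN": [1, 2, 3],        # participant identifier shared across modules
    "RIDAGEYR": [34, 51, 7],  # age in years (illustrative column)
})
chemicals = pd.DataFrame({
    "SEQN": [1, 3],
    "LBXBPB": [1.2, 0.8],     # blood lead (illustrative biomarker column)
})

# A left join keeps every participant from the demographics module; biomarker
# columns are NaN for participants without a chemicals record.
merged = demographics.merge(chemicals, on="SEQN", how="left")
print(merged.shape)  # (3, 3)
```

The same pattern extends to merging all ten modules, as demonstrated in "example\_0 - merge\_datasets\_together.Rmd".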
Example starter codes: The set of starter code to help users conduct exposome analysis consists of four R markdown files (.Rmd). We recommend going through the tutorials in order.
- “example\_0 - merge\_datasets\_together.Rmd” demonstrates how to merge the curated NHANES datasets together.
- “example\_1 - account\_for\_nhanes_design.Rmd” demonstrates how to conduct a linear regression model, a survey-weighted regression model, a Cox proportional hazard model, and a survey-weighted Cox proportional hazard model.
- “example\_2 - calculate\_summary\_statistics.Rmd” demonstrates how to calculate summary statistics for one variable and multiple variables with and without accounting for the NHANES sampling design.
- “example\_3 - run\_multiple\_regressions.Rmd” demonstrates how to run multiple regression models with and without adjusting for the sampling design. |
judith0/Classification_INE | ---
license: c-uda
---
|
benayas/banking_artificial_5pct_v1 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1074637
num_examples: 10003
download_size: 315841
dataset_size: 1074637
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
TheGreatRambler/mm2_user_first_cleared | ---
language:
- multilingual
license:
- cc-by-nc-sa-4.0
multilinguality:
- multilingual
size_categories:
- 10M<n<100M
source_datasets:
- original
task_categories:
- other
- object-detection
- text-retrieval
- token-classification
- text-generation
task_ids: []
pretty_name: Mario Maker 2 user first clears
tags:
- text-mining
---
# Mario Maker 2 user first clears
Part of the [Mario Maker 2 Dataset Collection](https://tgrcode.com/posts/mario_maker_2_datasets)
## Dataset Description
The Mario Maker 2 user first clears dataset consists of 17.8 million first clears from Nintendo's online service totaling around 157MB of data. The dataset was created using the self-hosted [Mario Maker 2 api](https://tgrcode.com/posts/mario_maker_2_api) over the course of 1 month in February 2022.
### How to use it
The Mario Maker 2 user first clears dataset is very large, so for most use cases it is recommended to make use of the streaming API of `datasets`. You can load and iterate through the dataset with the following code:
```python
from datasets import load_dataset
ds = load_dataset("TheGreatRambler/mm2_user_first_cleared", streaming=True, split="train")
print(next(iter(ds)))
#OUTPUT:
{
'pid': '14510618610706594411',
'data_id': 25199891
}
```
Each row is a unique first clear of the level denoted by `data_id`, achieved by the player denoted by `pid`.
You can also download the full dataset. Note that this will download ~157MB:
```python
ds = load_dataset("TheGreatRambler/mm2_user_first_cleared", split="train")
```
## Data Structure
### Data Instances
```python
{
'pid': '14510618610706594411',
'data_id': 25199891
}
```
### Data Fields
|Field|Type|Description|
|---|---|---|
|pid|string|The player ID of this user, an unsigned 64 bit integer as a string|
|data_id|int|The data ID of the level this user first cleared|
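Because each row records a single player/level pair, per-level tallies reduce to counting rows. A small sketch with `collections.Counter`, shown here on an in-memory stand-in for the streamed rows (when streaming for real, iterate over the `ds` object from the example above instead):

```python
from collections import Counter

# Stand-in for rows streamed from the dataset; each dict matches the schema above.
rows = [
    {"pid": "14510618610706594411", "data_id": 25199891},
    {"pid": "111", "data_id": 25199891},
    {"pid": "222", "data_id": 30000000},
]

# Tally how many players earned a first clear on each level.
clears_per_level = Counter(row["data_id"] for row in rows)
print(clears_per_level[25199891])  # 2

# pid is an unsigned 64-bit integer stored as a string; convert when needed
# (Python ints handle the full uint64 range without overflow).
pids = [int(row["pid"]) for row in rows]
```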
### Data Splits
The dataset only contains a train split.
<!-- TODO create detailed statistics -->
## Dataset Creation
The dataset was created over a little more than a month in February 2022 using the self-hosted [Mario Maker 2 api](https://tgrcode.com/posts/mario_maker_2_api). As requests made to Nintendo's servers require authentication, the process had to be done with the utmost care, limiting download speed so as not to overload the API and risk a ban. There are no intentions to create an updated release of this dataset.
## Considerations for Using the Data
The dataset contains no harmful language or depictions.
|
NghiemAbe/Gold-Triplet | ---
dataset_info:
features:
- name: query
dtype: string
- name: positive
dtype: string
- name: negative
dtype: string
- name: distance
dtype: float32
splits:
- name: train
num_bytes: 136045423
num_examples: 344543
download_size: 79902955
dataset_size: 136045423
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Mitsuki-Sakamoto/alpaca_farm-reward-model-deberta-v3-large-v2-re-preference-256-nsample-16 | ---
dataset_info:
config_name: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
splits:
- name: preference
num_bytes: 57932342
num_examples: 20001
download_size: 26460861
dataset_size: 57932342
configs:
- config_name: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500
data_files:
- split: preference
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/preference-*
---
|
AdapterOcean/med_alpaca_standardized_cluster_99_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4228056
num_examples: 4675
download_size: 1687387
dataset_size: 4228056
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_99_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hrenwac_para | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
- hr
license:
- cc-by-sa-3.0
multilinguality:
- translation
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- translation
task_ids: []
paperswithcode_id: null
pretty_name: HrenwacPara
dataset_info:
features:
- name: translation
dtype:
translation:
languages:
- en
- hr
config_name: hrenWaC
splits:
- name: train
num_bytes: 29602110
num_examples: 99001
download_size: 11640281
dataset_size: 29602110
---
# Dataset Card for hrenwac_para
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** http://nlp.ffzg.hr/resources/corpora/hrenwac/
- **Repository:** http://nlp.ffzg.hr/data/corpora/hrenwac/hrenwac.en-hr.txt.gz
- **Paper:** http://workshop2013.iwslt.org/downloads/IWSLT-2013-Cettolo.pdf
- **Leaderboard:**
- **Point of Contact:** [Nikola Ljubešič](mailto:nikola.ljubesic@ffzg.hr)
### Dataset Summary
The hrenWaC corpus version 2.0 consists of parallel Croatian-English texts crawled from the .hr top-level domain for Croatia. The corpus was built with Spidextor (https://github.com/abumatran/spidextor), a tool that glues together the output of SpiderLing, used for crawling, and Bitextor, used for bitext extraction. The accuracy of the extracted bitext is around 80% on the segment level and around 84% on the word level.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
The dataset is bilingual, containing Croatian and English.
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
The dataset is released under the [CC-BY-SA 3.0](http://creativecommons.org/licenses/by-sa/3.0/) license.
### Citation Information
```
@misc{11356/1058,
title = {Croatian-English parallel corpus {hrenWaC} 2.0},
author = {Ljube{\v s}i{\'c}, Nikola and Espl{\`a}-Gomis, Miquel and Ortiz Rojas, Sergio and Klubi{\v c}ka, Filip and Toral, Antonio},
url = {http://hdl.handle.net/11356/1058},
note = {Slovenian language resource repository {CLARIN}.{SI}},
copyright = {{CLARIN}.{SI} User Licence for Internet Corpora},
year = {2016} }
```
### Contributions
Thanks to [@IvanZidov](https://github.com/IvanZidov) for adding this dataset. |
iamkaikai/BASQUIAT-ART | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 15156684.0
num_examples: 228
download_size: 15097017
dataset_size: 15156684.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "BASQUIAT-ART"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aditijha/instruct_v3_5k_and_lima | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 22803070
num_examples: 6000
download_size: 13069762
dataset_size: 22803070
---
# Dataset Card for "instruct_v3_5k_and_lima"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlekseyKorshuk/revised-responses-filtered-sampled-lmgym | ---
dataset_info:
features:
- name: message_id
dtype: string
- name: input_text
dtype: string
- name: origianl_response
dtype: string
- name: edited_response
dtype: string
- name: user_id
dtype: string
- name: revised_response
dtype: string
- name: bot_id
dtype: string
- name: output_text
dtype: string
splits:
- name: train
num_bytes: 134201167
num_examples: 53540
download_size: 0
dataset_size: 134201167
---
# Dataset Card for "revised-responses-filtered-sampled-lmgym"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
manishiitg/truthful_qa | ---
dataset_info:
features:
- name: question
dtype: string
- name: mc1_targets
struct:
- name: choices
sequence: string
- name: mc2_targets
struct:
- name: choices
sequence: string
- name: labels
sequence: int64
splits:
- name: validation
num_bytes: 59921
num_examples: 66
download_size: 25637
dataset_size: 59921
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
|
Daniel-P-Gonzalez/CCOpenBooks | ---
license: cc-by-4.0
task_categories:
- text-generation
language:
- en
- es
- pl
pretty_name: CC OpenBooks
---
# Dataset Card for CC OpenBooks
## Dataset Description
CC OpenBooks is a curated collection of high-quality non-fiction books. All texts are from CC-BY-4.0 sources, with no license ambiguity.
The documents are normalized to markdown, and care is taken to ensure most formatting (e.g. inline LaTeX) remains intact. Files are manually inspected and cleaned of all defects wherever possible.
### Source Data
The following [Openstax](https://github.com/openstax) collections were used in creating this dataset:
- Introduction to Anthropology
- College Success Concise
- College Success
- Preparing for College Success
- Microbiology
- Chemistry 2e
- Chemistry: Atoms First 2e
- Física universitaria volumen 1
- Física universitaria volumen 2
- Física universitaria volumen 3
- Introduction to Business
- Astronomy 2e
- Principles of Marketing
- Psychologia
- Contemporary Mathematics
- Statistics
- World History Volume 1, to 1500
- World History Volume 2, from 1400
- Physics
- Introduction to Political Science
- Introducción a la estadística empresarial
- Introducción a la estadística
- Entrepreneurship
- Fizyka dla szkół wyższych. Tom 1
- Fizyka dla szkół wyższych. Tom 2
- Fizyka dla szkół wyższych. Tom 3
- Writing Guide with Handbook
- Biology 2e
- Biology for AP® Courses
- Concepts of Biology
- Introduction to Sociology 3e
- Life, Liberty, and the Pursuit of Happiness
- Precálculo 2ed
- Psychology 2e
- Playground
- University Physics Volume 1
- University Physics Volume 2
- University Physics Volume 3
- Principles of Finance
- U.S. History
- American Government 3e
- Anatomy and Physiology 2e
- Química 2ed
- Química: Comenzando con los átomos 2ed
- Elementary Algebra 2e
- Intermediate Algebra 2e
- Prealgebra 2e
- Business Ethics
- Organizational Behavior
- Principles of Management
- Introduction to Intellectual Property
- Principles of Economics 3e
- Principles of Macroeconomics 3e
- Principles of Macroeconomics for AP® Courses 2e
- Algebra and Trigonometry 2e
- College Algebra 2e
- College Algebra with Corequisite Support 2e
- Precalculus 2e
- Introduction to Philosophy
- College Physics 2e
- College Physics for AP® Courses 2e
- Mikroekonomia – Podstawy
Books from other sources:
- [Byte of Python](https://github.com/swaroopch/byte-of-python)
- [Non-Programmer's Tutorial for Python 3](https://en.wikibooks.org/wiki/Non-Programmer%27s_Tutorial_for_Python_3)
- [Python Programming](https://en.wikibooks.org/wiki/Python_Programming)
- [Algorithms](https://en.wikibooks.org/wiki/Algorithms)
- [Communication Theory](https://en.wikibooks.org/wiki/Communication_Theory)
- [C Programming](https://en.wikibooks.org/wiki/C_Programming)
- [C Sharp Programming](https://en.wikibooks.org/wiki/C_Sharp_Programming)
- [Formal Logic](https://en.wikibooks.org/wiki/Formal_Logic)
- [Haskell](https://en.wikibooks.org/wiki/Haskell)
- [How To Assemble A Desktop PC](https://en.wikibooks.org/wiki/How_To_Assemble_A_Desktop_PC)
- [LaTeX](https://en.wikibooks.org/wiki/LaTeX)
- [OpenSSH](https://en.wikibooks.org/wiki/OpenSSH)
- [Write Yourself a Scheme in 48 Hours](https://en.wikibooks.org/wiki/Write_Yourself_a_Scheme_in_48_Hours)
- [X86 Disassembly](https://en.wikibooks.org/wiki/X86_Disassembly)
- [XML - Managing Data Exchange](https://en.wikibooks.org/wiki/XML_-_Managing_Data_Exchange)
- [Bourne Shell Scripting](https://en.wikibooks.org/wiki/Bourne_Shell_Scripting)
- [F Sharp Programming](https://en.wikibooks.org/wiki/F_Sharp_Programming)
- [Tcl Programming](https://en.wikibooks.org/wiki/Tcl_Programming)
- [Java Programming](https://en.wikibooks.org/wiki/Java_Programming)
- [MATLAB Programming](https://en.wikibooks.org/wiki/MATLAB_Programming)
- [MySQL](https://en.wikibooks.org/wiki/MySQL)
- [Foundations of Computer_Science](https://en.wikibooks.org/wiki/Foundations_of_Computer_Science)
- [Introduction to Numerical Methods](https://en.wikibooks.org/wiki/Introduction_to_Numerical_Methods)
- [Think Python](https://en.wikibooks.org/wiki/Think_Python)
- [Engineering Acoustics](https://en.wikibooks.org/wiki/Engineering_Acoustics)
- [Control Systems](https://en.wikibooks.org/wiki/Control_Systems)
- [Sensory Systems](https://en.wikibooks.org/wiki/Sensory_Systems)
- [Transportation Economics](https://en.wikibooks.org/wiki/Transportation_Economics)
- [Circuit Theory](https://en.wikibooks.org/wiki/Circuit_Theory)
- [Communication Systems](https://en.wikibooks.org/wiki/Communication_Systems)
- [Spanish](https://en.wikibooks.org/wiki/Spanish/Contents)
- [Latin](https://en.wikibooks.org/wiki/Latin)
- [English in Use](https://en.wikibooks.org/wiki/English_in_Use)
- [French](https://en.wikibooks.org/wiki/French)
- [German](https://en.wikibooks.org/wiki/German)
- [High School Mathematics Extensions](https://en.wikibooks.org/wiki/High_School_Mathematics_Extensions)
- [Linear Algebra](https://en.wikibooks.org/wiki/Linear_Algebra)
- [Timeless Theorems of Mathematics](https://en.wikibooks.org/wiki/Timeless_Theorems_of_Mathematics)
- [A Brief Introduction to Engineering Computation with MATLAB](https://collection.bccampus.ca/textbooks/a-brief-introduction-to-engineering-computation-with-matlab/)
- [Aerodynamics and Aircraft Performance, 3rd edition](https://vtechworks.lib.vt.edu/handle/10919/96525)
- [Acoustics](https://en.wikibooks.org/wiki/Acoustics)
- [Ada_Programming](https://en.wikibooks.org/wiki/Ada_Programming)
- [Algorithms](https://en.wikibooks.org/wiki/Algorithms)
- [Anatomy_and_Physiology_of_Animals](https://en.wikibooks.org/wiki/Anatomy_and_Physiology_of_Animals)
- [Applications_of_ICT_in_Libraries](https://en.wikibooks.org/wiki/Applications_of_ICT_in_Libraries)
- [Arimaa](https://en.wikibooks.org/wiki/Arimaa)
- [A-level_Computing/AQA](https://en.wikibooks.org/wiki/A-level_Computing/AQA)
- [Basic_Physics_of_Nuclear_Medicine](https://en.wikibooks.org/wiki/Basic_Physics_of_Nuclear_Medicine)
- [Blended_Learning_in_K-12](https://en.wikibooks.org/wiki/Blended_Learning_in_K-12)
- [Blender_3D:_Noob_to_Pro](https://en.wikibooks.org/wiki/Blender_3D:_Noob_to_Pro)
- [C_Programming](https://en.wikibooks.org/wiki/C_Programming)
- [Chess](https://en.wikibooks.org/wiki/Chess)
- [Coaching_Youth_Middle_Distance_Runners](https://en.wikibooks.org/wiki/Coaching_Youth_Middle_Distance_Runners)
- [Cognitive_Psychology_and_Cognitive_Neuroscience](https://en.wikibooks.org/wiki/Cognitive_Psychology_and_Cognitive_Neuroscience)
- [Consciousness_Studies](https://en.wikibooks.org/wiki/Consciousness_Studies)
- [Elements_of_Political_Communication](https://en.wikibooks.org/wiki/Elements_of_Political_Communication)
- [Engineering_Acoustics](https://en.wikibooks.org/wiki/Engineering_Acoustics)
- [European_History](https://en.wikibooks.org/wiki/European_History)
- [First_Aid](https://en.wikibooks.org/wiki/First_Aid)
- [Formal_Logic](https://en.wikibooks.org/wiki/Formal_Logic)
- [Fundamentals_of_Transportation](https://en.wikibooks.org/wiki/Fundamentals_of_Transportation)
- [Guitar](https://en.wikibooks.org/wiki/Guitar)
- [High_School_Mathematics_Extensions](https://en.wikibooks.org/wiki/High_School_Mathematics_Extensions)
- [Historical_Geology](https://en.wikibooks.org/wiki/Historical_Geology)
- [How_To_Assemble_A_Desktop_PC](https://en.wikibooks.org/wiki/How_To_Assemble_A_Desktop_PC)
- [Human_Physiology](https://en.wikibooks.org/wiki/Human_Physiology)
- [Introduction_to_Paleoanthropology](https://en.wikibooks.org/wiki/Introduction_to_Paleoanthropology)
- [Introduction_to_Sociology](https://en.wikibooks.org/wiki/Introduction_to_Sociology)
- [Knowing_Knoppix](https://en.wikibooks.org/wiki/Knowing_Knoppix)
- [Learning_Theories](https://en.wikibooks.org/wiki/Learning_Theories)
- [Linear_Algebra](https://en.wikibooks.org/wiki/Linear_Algebra)
- [Lucid_Dreaming](https://en.wikibooks.org/wiki/Lucid_Dreaming)
- [Managing_Groups_and_Teams](https://en.wikibooks.org/wiki/Managing_Groups_and_Teams)
- [Miskito](https://en.wikibooks.org/wiki/Miskito)
- [Muggles%27_Guide_to_Harry_Potter](https://en.wikibooks.org/wiki/Muggles%27_Guide_to_Harry_Potter)
- [New_Zealand_History](https://en.wikibooks.org/wiki/New_Zealand_History)
- [Physics_Study_Guide](https://en.wikibooks.org/wiki/Physics_Study_Guide)
- [Proteomics](https://en.wikibooks.org/wiki/Proteomics)
- [Radiation_Oncology](https://en.wikibooks.org/wiki/Radiation_Oncology)
- [Social_and_Cultural_Foundations_of_American_Education](https://en.wikibooks.org/wiki/Social_and_Cultural_Foundations_of_American_Education)
- [Special_Relativity](https://en.wikibooks.org/wiki/Special_Relativity)
- [Speech-Language_Pathology/Stuttering](https://en.wikibooks.org/wiki/Speech-Language_Pathology/Stuttering)
- [This_Quantum_World](https://en.wikibooks.org/wiki/This_Quantum_World)
- [UK_Constitution_and_Government](https://en.wikibooks.org/wiki/UK_Constitution_and_Government)
- [UNDP-APDIP_Books](https://en.wikibooks.org/wiki/UNDP-APDIP_Books)
- [Using_Wikibooks](https://en.wikibooks.org/wiki/Using_Wikibooks)
- [Wikijunior:Solar_System](https://en.wikibooks.org/wiki/Wikijunior:Solar_System)
- [XForms](https://en.wikibooks.org/wiki/XForms)
- [Zine_Making](https://en.wikibooks.org/wiki/Zine_Making)
- [Basic_Computing_Using_Windows](https://en.wikibooks.org/wiki/Basic_Computing_Using_Windows)
- [Cognitive_Psychology_and_Cognitive_Neuroscience](https://en.wikibooks.org/wiki/Cognitive_Psychology_and_Cognitive_Neuroscience)
- [Movie_Making_Manual](https://en.wikibooks.org/wiki/Movie_Making_Manual)
- [Organic_Chemistry](https://en.wikibooks.org/wiki/Organic_Chemistry)
- [European_History](https://en.wikibooks.org/wiki/European_History)
- [Cookbook](https://en.wikibooks.org/wiki/Cookbook)
- [Chess](https://en.wikibooks.org/wiki/Chess)
- [Japanese](https://en.wikibooks.org/wiki/Japanese)
- [Consciousness_Studies](https://en.wikibooks.org/wiki/Consciousness_Studies)
- [Chinese_(Mandarin)](https://en.wikibooks.org/wiki/Chinese_(Mandarin))
- [Wikijunior:Solar_System](https://en.wikibooks.org/wiki/Wikijunior:Solar_System)
- [Blender_3D:_Noob_to_Pro](https://en.wikibooks.org/wiki/Blender_3D:_Noob_to_Pro)
- [FHSST_Physics](https://en.wikibooks.org/wiki/FHSST_Physics)
- [How_To_Assemble_A_Desktop_PC](https://en.wikibooks.org/wiki/How_To_Assemble_A_Desktop_PC)
- [History_of_the_United_States](https://en.wikibooks.org/wiki/History_of_the_United_States)
- [High_School_Mathematics_Extensions](https://en.wikibooks.org/wiki/High_School_Mathematics_Extensions)
- [Lucid_Dreaming](https://en.wikibooks.org/wiki/Lucid_Dreaming)
- [Nanotechnology](https://en.wikibooks.org/wiki/Nanotechnology)
- [Introduction to Online Convex Optimization](https://arxiv.org/abs/1909.05207)
- [Structure and Interpretation of Computer Programs](https://github.com/sarabander/sicp-pdf)
- [Convex Optimization: Algorithms and Complexity](https://arxiv.org/abs/1405.4980)
- [Trustworthy Machine Learning](https://arxiv.org/abs/2310.08215)
#### Initial Data Collection and Normalization
Wherever possible, the books are converted to Markdown. This formatting is kept intact with downstream tasks in mind (e.g. conversational QA).
The source of the text is prepended to each document to add context; this may also improve the source-attribution and guidance capabilities of trained models.
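The prepending scheme described above can be sketched as follows (the exact `Source:` label and separator are assumptions for illustration; inspect a sample document from the dataset to confirm the real format):

```python
def prepend_source(source: str, text: str) -> str:
    """Prefix a document with its source to add context.

    The "Source:" label and blank-line separator are assumptions for
    illustration; check a real record for the exact format used.
    """
    return f"Source: {source}\n\n{text}"


doc = prepend_source(
    "https://en.wikibooks.org/wiki/C_Programming",
    "# C Programming\n\nC is a general-purpose programming language.",
)
print(doc.splitlines()[0])  # → Source: https://en.wikibooks.org/wiki/C_Programming
```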
### Licensing Information
All books in this collection were previously released with an unambiguous cc-by-4.0 license by the original authors. |
Antreas/Cityscapes | ---
dataset_info:
features:
- name: image
dtype: image
- name: semantic_segmentation
dtype: image
splits:
- name: train
num_bytes: 7068783630.625
num_examples: 2975
- name: val
num_bytes: 1202393069.0
num_examples: 500
- name: test
num_bytes: 3507968336.775
num_examples: 1525
download_size: 11773867029
dataset_size: 11779145036.4
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
---
|
kanishka/counterfactual-babylm-aanns_indef_non_num_removal | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 581831219
num_examples: 11633278
- name: validation
num_bytes: 56120230
num_examples: 1026747
download_size: 421601983
dataset_size: 637951449
---
# Dataset Card for "counterfactual-babylm-aanns_indef_non_num_removal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jerryjalapeno/stories_shuffled | ---
license: openrail
---
|
liuyanchen1015/MULTI_VALUE_rte_proximal_distal_demonstratives | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 144054
num_examples: 298
- name: train
num_bytes: 140356
num_examples: 279
download_size: 194080
dataset_size: 284410
---
# Dataset Card for "MULTI_VALUE_rte_proximal_distal_demonstratives"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DynamicSuperb/SpokenTermDetection_LJSpeech | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype: audio
- name: text
dtype: string
- name: instruction
dtype: string
- name: label
dtype: string
- name: transcription
dtype: string
splits:
- name: test
num_bytes: 58037024.610687025
num_examples: 200
download_size: 57031943
dataset_size: 58037024.610687025
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "spokenTermDetection_LJSpeech"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nitinbhayana/processed_demo | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1729780.7113998258
num_examples: 11360
- name: test
num_bytes: 192316.28860017427
num_examples: 1263
download_size: 1149085
dataset_size: 1922097.0
---
# Dataset Card for "processed_demo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HuggingFaceM4/VQAv2_modif-Sample | |
devesh5/codeconv-fortran-to-rust | ---
license: apache-2.0
task_categories:
- translation
tags:
- code
size_categories:
- n<1K
--- |
CVasNLPExperiments/VQAv2_minival_validation_google_flan_t5_xxl_mode_D_PNP_GENERIC_Q_rices_ns_25994 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
sequence: string
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large_clean_
num_bytes: 325999922
num_examples: 25994
download_size: 17855048
dataset_size: 325999922
---
# Dataset Card for "VQAv2_minival_validation_google_flan_t5_xxl_mode_D_PNP_GENERIC_Q_rices_ns_25994"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cnut1648/ScienceQA-LLAVA | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: question
dtype: string
- name: context
dtype: string
- name: choice
dtype: string
- name: answer
dtype: string
- name: lecture
dtype: string
- name: solution
dtype: string
splits:
- name: train
num_bytes: 425066440.932
num_examples: 12726
- name: validation
num_bytes: 141104381.824
num_examples: 4241
- name: test
num_bytes: 139230285.176
num_examples: 4241
download_size: 681887955
dataset_size: 705401107.932
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
# Dataset Card for "ScienceQA-LLAVA"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Thaweewat/databricks-dolly-15k-th | ---
license: cc-by-sa-3.0
task_categories:
- question-answering
- summarization
tags:
- instruction-finetuning
language:
- th
size_categories:
- 10K<n<100K
---
# Summary
This is a Thai 🇹🇭-instructed dataset translated from `databricks-dolly-15k` using Google Cloud Translation.
`databricks-dolly-15k` is an open-source dataset of instruction-following records generated by thousands of Databricks employees in several behavioral
categories outlined in the InstructGPT paper, including brainstorming, classification, closed QA, generation, information extraction, open QA, and summarization.
This dataset can be used for any purpose, whether academic or commercial, under the terms of the
[Creative Commons Attribution-ShareAlike 3.0 Unported License](https://creativecommons.org/licenses/by-sa/3.0/legalcode).
Supported Tasks:
- Training LLMs
- Synthetic Data Generation
- Data Augmentation
Languages: Thai
Version: 1.0
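The per-field translation step can be sketched as below. The field names follow the original `databricks-dolly-15k` schema; the actual pipeline used Google Cloud Translation, which is swapped out here for an arbitrary text-to-text function:

```python
from typing import Callable, Dict


def translate_record(record: Dict[str, str],
                     translate: Callable[[str], str]) -> Dict[str, str]:
    """Translate the free-text fields of a dolly-style record.

    `translate` is any text -> text function; in the actual dataset it
    would wrap the Google Cloud Translation API (an assumption based on
    the description above). Empty fields and the category label are
    passed through unchanged.
    """
    text_fields = ("instruction", "context", "response")
    return {k: (translate(v) if k in text_fields and v else v)
            for k, v in record.items()}


rec = {"instruction": "Summarize this article.", "context": "",
       "response": "A short summary.", "category": "summarization"}
print(translate_record(rec, str.upper)["instruction"])  # → SUMMARIZE THIS ARTICLE.
```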
---
|
suolyer/pile_stackexchange | ---
license: apache-2.0
---
|
thanhdath/vietnamese-retrieval | ---
dataset_info:
features:
- name: query_id
dtype: string
- name: query
dtype: string
- name: positive_passages
list:
- name: docid
dtype: string
- name: text
dtype: string
- name: title
dtype: string
- name: negative_passages
list:
- name: docid
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 2745183922
num_examples: 273386
download_size: 927038024
dataset_size: 2745183922
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "vietnamese-retrieval"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bitLixiang/classification_of_cover_quilts | ---
license: apache-2.0
---
|
VirtualRoyalty/QC | ---
license: mit
task_categories:
- text-classification
language:
- en
size_categories:
- 1K<n<10K
pretty_name: uiuc-qc
---
# Question Classification dataset
**Fixed version** (some examples were added to the test split so that train and test share the same label set)
This data collection contains all the data used in our learning question classification experiments (see [1]): question class definitions, the training and testing question sets, examples of question preprocessing, feature definition scripts, and examples of semantically related word features. This work was done by Xin Li and Dan Roth.
Source: https://cogcomp.seas.upenn.edu/Data/QA/QC/ |
open-llm-leaderboard/details_seyf1elislam__WestKunai-XD-7b | ---
pretty_name: Evaluation run of seyf1elislam/WestKunai-XD-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [seyf1elislam/WestKunai-XD-7b](https://huggingface.co/seyf1elislam/WestKunai-XD-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_seyf1elislam__WestKunai-XD-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-16T03:28:08.318903](https://huggingface.co/datasets/open-llm-leaderboard/details_seyf1elislam__WestKunai-XD-7b/blob/main/results_2024-03-16T03-28-08.318903.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6516064023620174,\n\
\ \"acc_stderr\": 0.032057150248618674,\n \"acc_norm\": 0.6518899154824054,\n\
\ \"acc_norm_stderr\": 0.0327138411493384,\n \"mc1\": 0.5067319461444308,\n\
\ \"mc1_stderr\": 0.017501914492655396,\n \"mc2\": 0.6729359137263011,\n\
\ \"mc2_stderr\": 0.01505540312085654\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6834470989761092,\n \"acc_stderr\": 0.013592431519068077,\n\
\ \"acc_norm\": 0.712457337883959,\n \"acc_norm_stderr\": 0.013226719056266127\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7033459470225055,\n\
\ \"acc_stderr\": 0.004558491550673697,\n \"acc_norm\": 0.8759211312487553,\n\
\ \"acc_norm_stderr\": 0.0032899775233939097\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.02749566368372406,\n\
\ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.02749566368372406\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633507,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633507\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524572,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524572\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188703,\n \
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188703\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8385321100917431,\n \"acc_stderr\": 0.01577623925616323,\n \"\
acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.01577623925616323\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621112,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621112\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281376,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281376\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8326947637292464,\n\
\ \"acc_stderr\": 0.013347327202920332,\n \"acc_norm\": 0.8326947637292464,\n\
\ \"acc_norm_stderr\": 0.013347327202920332\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4491620111731844,\n\
\ \"acc_stderr\": 0.01663583834163192,\n \"acc_norm\": 0.4491620111731844,\n\
\ \"acc_norm_stderr\": 0.01663583834163192\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n\
\ \"acc_stderr\": 0.012745204626083136,\n \"acc_norm\": 0.46870925684485004,\n\
\ \"acc_norm_stderr\": 0.012745204626083136\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507208,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507208\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5067319461444308,\n\
\ \"mc1_stderr\": 0.017501914492655396,\n \"mc2\": 0.6729359137263011,\n\
\ \"mc2_stderr\": 0.01505540312085654\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8224151539068666,\n \"acc_stderr\": 0.010740676861359238\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.66565579984837,\n \
\ \"acc_stderr\": 0.012994634003332752\n }\n}\n```"
repo_url: https://huggingface.co/seyf1elislam/WestKunai-XD-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|arc:challenge|25_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|gsm8k|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hellaswag|10_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-16T03-28-08.318903.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-16T03-28-08.318903.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- '**/details_harness|winogrande|5_2024-03-16T03-28-08.318903.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-16T03-28-08.318903.parquet'
- config_name: results
data_files:
- split: 2024_03_16T03_28_08.318903
path:
- results_2024-03-16T03-28-08.318903.parquet
- split: latest
path:
- results_2024-03-16T03-28-08.318903.parquet
---
# Dataset Card for Evaluation run of seyf1elislam/WestKunai-XD-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [seyf1elislam/WestKunai-XD-7b](https://huggingface.co/seyf1elislam/WestKunai-XD-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_seyf1elislam__WestKunai-XD-7b",
"harness_winogrande_5",
split="train")
```
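The configuration names above follow a regular pattern: `harness_`, then the task name with `-` and `:` replaced by `_`, then the few-shot count. A small helper (hypothetical, not part of the `datasets` library) sketching that mapping, assuming the naming convention holds for all configs in this card:

```python
def task_to_config(task: str, num_fewshot: int) -> str:
    """Map a harness task name (e.g. "hendrycksTest-world_religions")
    and its few-shot count to the config name used in this dataset."""
    # Config names replace "-" and ":" with "_" and append the few-shot count.
    normalized = task.replace("-", "_").replace(":", "_")
    return f"harness_{normalized}_{num_fewshot}"


print(task_to_config("hendrycksTest-world_religions", 5))
# harness_hendrycksTest_world_religions_5
print(task_to_config("truthfulqa:mc", 0))
# harness_truthfulqa_mc_0
```

The resulting string can be passed as the second argument to `load_dataset`, as in the example above.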
## Latest results
These are the [latest results from run 2024-03-16T03:28:08.318903](https://huggingface.co/datasets/open-llm-leaderboard/details_seyf1elislam__WestKunai-XD-7b/blob/main/results_2024-03-16T03-28-08.318903.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its own configuration, with a "latest" split pointing to the most recent evaluation):
```python
{
"all": {
"acc": 0.6516064023620174,
"acc_stderr": 0.032057150248618674,
"acc_norm": 0.6518899154824054,
"acc_norm_stderr": 0.0327138411493384,
"mc1": 0.5067319461444308,
"mc1_stderr": 0.017501914492655396,
"mc2": 0.6729359137263011,
"mc2_stderr": 0.01505540312085654
},
"harness|arc:challenge|25": {
"acc": 0.6834470989761092,
"acc_stderr": 0.013592431519068077,
"acc_norm": 0.712457337883959,
"acc_norm_stderr": 0.013226719056266127
},
"harness|hellaswag|10": {
"acc": 0.7033459470225055,
"acc_stderr": 0.004558491550673697,
"acc_norm": 0.8759211312487553,
"acc_norm_stderr": 0.0032899775233939097
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.02749566368372406,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.02749566368372406
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568525,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633507,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633507
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524572,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524572
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.02995382389188703,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.02995382389188703
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.01577623925616323,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.01577623925616323
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621112,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621112
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281376,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281376
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8326947637292464,
"acc_stderr": 0.013347327202920332,
"acc_norm": 0.8326947637292464,
"acc_norm_stderr": 0.013347327202920332
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4491620111731844,
"acc_stderr": 0.01663583834163192,
"acc_norm": 0.4491620111731844,
"acc_norm_stderr": 0.01663583834163192
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600713,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600713
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46870925684485004,
"acc_stderr": 0.012745204626083136,
"acc_norm": 0.46870925684485004,
"acc_norm_stderr": 0.012745204626083136
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507208,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507208
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5067319461444308,
"mc1_stderr": 0.017501914492655396,
"mc2": 0.6729359137263011,
"mc2_stderr": 0.01505540312085654
},
"harness|winogrande|5": {
"acc": 0.8224151539068666,
"acc_stderr": 0.010740676861359238
},
"harness|gsm8k|5": {
"acc": 0.66565579984837,
"acc_stderr": 0.012994634003332752
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
bogeumkim/sw_hackathon_dataset | ---
license: mit
---
|
ccdv/arxiv-summarization | ---
language:
- en
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
task_categories:
- summarization
- text-generation
task_ids: []
tags:
- conditional-text-generation
train-eval-index:
- config: document
task: summarization
task_id: summarization
splits:
eval_split: test
col_mapping:
article: text
abstract: target
---
# Arxiv dataset for summarization
Dataset for summarization of long documents.\
Adapted from this [repo](https://github.com/armancohan/long-summarization).\
Note that the original data are pre-tokenized, so this dataset returns `" ".join(text)` and adds `"\n"` between paragraphs. \
This dataset is compatible with the [`run_summarization.py`](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization) script from Transformers if you add this line to the `summarization_name_mapping` variable:
```python
"ccdv/arxiv-summarization": ("article", "abstract")
```
### Data Fields
- `id`: paper id
- `article`: a string containing the body of the paper
- `abstract`: a string containing the abstract of the paper
### Data Splits
This dataset has 3 splits: _train_, _validation_, and _test_. \
Token counts are whitespace-based.
| Dataset Split | Number of Instances | Avg. tokens (article / abstract) |
| ------------- | --------------------|:---------------------------------|
| Train | 203,037 | 6038 / 299 |
| Validation | 6,436 | 5894 / 172 |
| Test | 6,440 | 5905 / 174 |
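The averages in the table above are whitespace token counts; a minimal sketch of how such counts can be reproduced (the helper names are illustrative, not part of the dataset):

```python
def whitespace_token_count(text: str) -> int:
    """Count tokens by splitting on runs of whitespace."""
    return len(text.split())


def average_tokens(texts) -> float:
    """Mean whitespace token count over an iterable of strings."""
    texts = list(texts)
    return sum(whitespace_token_count(t) for t in texts) / len(texts)


# Toy usage with in-memory strings (not the actual dataset):
articles = ["we propose a model for long documents", "a second short article"]
print(average_tokens(articles))
```

Applying `average_tokens` to the `article` and `abstract` columns of each split should recover figures close to those in the table.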
# Cite original article
```
@inproceedings{cohan-etal-2018-discourse,
title = "A Discourse-Aware Attention Model for Abstractive Summarization of Long Documents",
author = "Cohan, Arman and
Dernoncourt, Franck and
Kim, Doo Soon and
Bui, Trung and
Kim, Seokhwan and
Chang, Walter and
Goharian, Nazli",
booktitle = "Proceedings of the 2018 Conference of the North {A}merican Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers)",
month = jun,
year = "2018",
address = "New Orleans, Louisiana",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/N18-2097",
doi = "10.18653/v1/N18-2097",
pages = "615--621",
abstract = "Neural abstractive summarization models have led to promising results in summarizing relatively short documents. We propose the first model for abstractive summarization of single, longer-form documents (e.g., research papers). Our approach consists of a new hierarchical encoder that models the discourse structure of a document, and an attentive discourse-aware decoder to generate the summary. Empirical results on two large-scale datasets of scientific papers show that our model significantly outperforms state-of-the-art models.",
}
```
|
samitizerxu/kelp_data_rgb_swin_nir | ---
dataset_info:
features:
- name: pixel_values
dtype:
array3_d:
shape:
- 350
- 350
- 5
dtype: uint8
- name: label
dtype: image
splits:
- name: train
num_bytes: 6223524957.625
num_examples: 5635
- name: test
num_bytes: 1574172987.75
num_examples: 1426
download_size: 2925302863
dataset_size: 7797697945.375
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
CyberHarem/michiru_bluearchive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of michiru/千鳥ミチル/满 (Blue Archive)
This is the dataset of michiru/千鳥ミチル/满 (Blue Archive), containing 171 images and their tags.
The core tags of this character are `animal_ears, grey_hair, long_hair, raccoon_ears, raccoon_girl, halo, twintails, yellow_eyes, hair_between_eyes, multicolored_hair, tail, raccoon_tail, hair_ornament, ahoge, fang`, which are pruned in this dataset.
Images are crawled from many sites (e.g., Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 171 | 237.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/michiru_bluearchive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 171 | 200.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/michiru_bluearchive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 422 | 411.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/michiru_bluearchive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/michiru_bluearchive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, looking_at_viewer, ninja, simple_background, solo, white_background, black_scarf, bridal_gauntlets, pleated_skirt, sleeveless, blue_skirt, eyeshadow, smile, serafuku, gradient_hair, open_mouth, black_hair, japanese_clothes, kuji-in, black_pantyhose, detached_sleeves, pink_neckerchief |
| 1 | 6 |  |  |  |  |  | 1girl, black_pantyhose, black_scarf, looking_at_viewer, neckerchief, ninja, pleated_skirt, serafuku, sleeveless, smile, solo, blue_skirt, eyeshadow, open_mouth, pump_action, shotgun, simple_background, white_background, bridal_gauntlets, full_body, holding_gun, ribbon, bandaged_leg, japanese_clothes, shoes, standing_on_one_leg |
| 2 | 13 |  |  |  |  |  | 1girl, black_pantyhose, black_scarf, ninja, solo, looking_at_viewer, open_mouth, pleated_skirt, simple_background, white_background, blush, serafuku, bandaged_leg, blue_skirt, smile, white_shirt, bridal_gauntlets, pink_neckerchief, weapon |
| 3 | 5 |  |  |  |  |  | blush, penis, solo_focus, 1boy, 1girl, bar_censor, hetero, ninja, nipples, open_mouth, black_pantyhose, black_scarf, bridal_gauntlets, cum_in_pussy, torn_pantyhose, white_background, clothed_sex, heart, navel, on_back, sex_from_behind, simple_background, small_breasts, smile, spread_legs, trembling, vaginal |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | ninja | simple_background | solo | white_background | black_scarf | bridal_gauntlets | pleated_skirt | sleeveless | blue_skirt | eyeshadow | smile | serafuku | gradient_hair | open_mouth | black_hair | japanese_clothes | kuji-in | black_pantyhose | detached_sleeves | pink_neckerchief | neckerchief | pump_action | shotgun | full_body | holding_gun | ribbon | bandaged_leg | shoes | standing_on_one_leg | blush | white_shirt | weapon | penis | solo_focus | 1boy | bar_censor | hetero | nipples | cum_in_pussy | torn_pantyhose | clothed_sex | heart | navel | on_back | sex_from_behind | small_breasts | spread_legs | trembling | vaginal |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------|:--------------------|:-------|:-------------------|:--------------|:-------------------|:----------------|:-------------|:-------------|:------------|:--------|:-----------|:----------------|:-------------|:-------------|:-------------------|:----------|:------------------|:-------------------|:-------------------|:--------------|:--------------|:----------|:------------|:--------------|:---------|:---------------|:--------|:----------------------|:--------|:--------------|:---------|:--------|:-------------|:-------|:-------------|:---------|:----------|:---------------|:-----------------|:--------------|:--------|:--------|:----------|:------------------|:----------------|:--------------|:------------|:----------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | X | | X | | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 2 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | X | | X | X | | X | | | | X | | X | | | | | | | X | | | X | X | X | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | X | X | | X | X | X | | | | | X | | | X | | | | X | | | | | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
davidgaofc/d_RM_inout | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 510916
num_examples: 1820
download_size: 206617
dataset_size: 510916
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gimmaru/piqa | ---
dataset_info:
features:
- name: goal
dtype: string
- name: sol1
dtype: string
- name: sol2
dtype: string
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
splits:
- name: validation
num_bytes: 262619
num_examples: 1000
download_size: 0
dataset_size: 262619
---
# Dataset Card for "piqa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Note: This dataset was used to evaluate probability-based prompt selection techniques in the paper '[Improving Probability-based Prompt Selection Through Unified Evaluation and Analysis](https://arxiv.org/abs/2305.14877)'. It differs from the actual benchmark dataset.
open-llm-leaderboard/details_dball__zephyr-7b-dpo-qlora-no-sft | ---
pretty_name: Evaluation run of dball/zephyr-7b-dpo-qlora-no-sft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [dball/zephyr-7b-dpo-qlora-no-sft](https://huggingface.co/dball/zephyr-7b-dpo-qlora-no-sft)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dball__zephyr-7b-dpo-qlora-no-sft\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-10T10:37:17.220493](https://huggingface.co/datasets/open-llm-leaderboard/details_dball__zephyr-7b-dpo-qlora-no-sft/blob/main/results_2024-02-10T10-37-17.220493.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6382246766778686,\n\
\ \"acc_stderr\": 0.032245070292894334,\n \"acc_norm\": 0.6434025834441682,\n\
\ \"acc_norm_stderr\": 0.032890809766205786,\n \"mc1\": 0.29498164014687883,\n\
\ \"mc1_stderr\": 0.015964400965589657,\n \"mc2\": 0.44247071835148866,\n\
\ \"mc2_stderr\": 0.014495116448864753\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5870307167235495,\n \"acc_stderr\": 0.014388344935398326,\n\
\ \"acc_norm\": 0.6245733788395904,\n \"acc_norm_stderr\": 0.014150631435111728\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6455885281816371,\n\
\ \"acc_stderr\": 0.00477357009618505,\n \"acc_norm\": 0.8449512049392551,\n\
\ \"acc_norm_stderr\": 0.0036121146706989743\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316091,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316091\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n\
\ \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520193,\n \"\
acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520193\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"\
acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592154,\n \"\
acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.033175059300091826,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.033175059300091826\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121437,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121437\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131143,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131143\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.03095663632856655,\n \
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.03095663632856655\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8275229357798165,\n \"acc_stderr\": 0.01619780795684805,\n \"\
acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.01619780795684805\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5555555555555556,\n \"acc_stderr\": 0.03388857118502325,\n \"\
acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03388857118502325\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.02730348459906943,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.02730348459906943\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229146,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229146\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077816,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077816\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.80970625798212,\n\
\ \"acc_stderr\": 0.014036945850381392,\n \"acc_norm\": 0.80970625798212,\n\
\ \"acc_norm_stderr\": 0.014036945850381392\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577615,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577615\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.32849162011173183,\n\
\ \"acc_stderr\": 0.015707935398496454,\n \"acc_norm\": 0.32849162011173183,\n\
\ \"acc_norm_stderr\": 0.015707935398496454\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.024630048979824775,\n\
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.024630048979824775\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.026160584450140453,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.026160584450140453\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4498044328552803,\n\
\ \"acc_stderr\": 0.012705721498565107,\n \"acc_norm\": 0.4498044328552803,\n\
\ \"acc_norm_stderr\": 0.012705721498565107\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6544117647058824,\n \"acc_stderr\": 0.02888819310398863,\n\
\ \"acc_norm\": 0.6544117647058824,\n \"acc_norm_stderr\": 0.02888819310398863\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6683006535947712,\n \"acc_stderr\": 0.019047485239360375,\n \
\ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.019047485239360375\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.02740385941078685,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.02740385941078685\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29498164014687883,\n\
\ \"mc1_stderr\": 0.015964400965589657,\n \"mc2\": 0.44247071835148866,\n\
\ \"mc2_stderr\": 0.014495116448864753\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7916337805840569,\n \"acc_stderr\": 0.011414554399987729\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4162244124336619,\n \
\ \"acc_stderr\": 0.013577788334652672\n }\n}\n```"
repo_url: https://huggingface.co/dball/zephyr-7b-dpo-qlora-no-sft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|arc:challenge|25_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|gsm8k|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hellaswag|10_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T10-37-17.220493.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T10-37-17.220493.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- '**/details_harness|winogrande|5_2024-02-10T10-37-17.220493.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-10T10-37-17.220493.parquet'
- config_name: results
data_files:
- split: 2024_02_10T10_37_17.220493
path:
- results_2024-02-10T10-37-17.220493.parquet
- split: latest
path:
- results_2024-02-10T10-37-17.220493.parquet
---
# Dataset Card for Evaluation run of dball/zephyr-7b-dpo-qlora-no-sft
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [dball/zephyr-7b-dpo-qlora-no-sft](https://huggingface.co/dball/zephyr-7b-dpo-qlora-no-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dball__zephyr-7b-dpo-qlora-no-sft",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-10T10:37:17.220493](https://huggingface.co/datasets/open-llm-leaderboard/details_dball__zephyr-7b-dpo-qlora-no-sft/blob/main/results_2024-02-10T10-37-17.220493.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6382246766778686,
"acc_stderr": 0.032245070292894334,
"acc_norm": 0.6434025834441682,
"acc_norm_stderr": 0.032890809766205786,
"mc1": 0.29498164014687883,
"mc1_stderr": 0.015964400965589657,
"mc2": 0.44247071835148866,
"mc2_stderr": 0.014495116448864753
},
"harness|arc:challenge|25": {
"acc": 0.5870307167235495,
"acc_stderr": 0.014388344935398326,
"acc_norm": 0.6245733788395904,
"acc_norm_stderr": 0.014150631435111728
},
"harness|hellaswag|10": {
"acc": 0.6455885281816371,
"acc_stderr": 0.00477357009618505,
"acc_norm": 0.8449512049392551,
"acc_norm_stderr": 0.0036121146706989743
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316091,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316091
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520193,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520193
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.033175059300091826,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.033175059300091826
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121437,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131143,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131143
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.03095663632856655,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.03095663632856655
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.01619780795684805,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.01619780795684805
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.02730348459906943,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.02730348459906943
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229146,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229146
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077816,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077816
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.80970625798212,
"acc_stderr": 0.014036945850381392,
"acc_norm": 0.80970625798212,
"acc_norm_stderr": 0.014036945850381392
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577615,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577615
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.32849162011173183,
"acc_stderr": 0.015707935398496454,
"acc_norm": 0.32849162011173183,
"acc_norm_stderr": 0.015707935398496454
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.024630048979824775,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.024630048979824775
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140453,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140453
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4498044328552803,
"acc_stderr": 0.012705721498565107,
"acc_norm": 0.4498044328552803,
"acc_norm_stderr": 0.012705721498565107
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6544117647058824,
"acc_stderr": 0.02888819310398863,
"acc_norm": 0.6544117647058824,
"acc_norm_stderr": 0.02888819310398863
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.019047485239360375,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.019047485239360375
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.02740385941078685,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.02740385941078685
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29498164014687883,
"mc1_stderr": 0.015964400965589657,
"mc2": 0.44247071835148866,
"mc2_stderr": 0.014495116448864753
},
"harness|winogrande|5": {
"acc": 0.7916337805840569,
"acc_stderr": 0.011414554399987729
},
"harness|gsm8k|5": {
"acc": 0.4162244124336619,
"acc_stderr": 0.013577788334652672
}
}
```
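As a quick sanity check, a metric can be averaged over the per-task entries of a dictionary shaped like the JSON above (a sketch; note that the leaderboard's own aggregation may differ, e.g. in which tasks it includes):

```python
# Average a metric over per-task entries, skipping the precomputed
# "all" summary and any task that lacks the metric (e.g. gsm8k reports
# no "acc_norm").
def mean_metric(results, metric="acc"):
    values = [task[metric] for name, task in results.items()
              if name != "all" and metric in task]
    return sum(values) / len(values)

sample = {
    "all": {"acc": 0.55},
    "harness|arc:challenge|25": {"acc": 0.5, "acc_norm": 0.6},
    "harness|hellaswag|10": {"acc": 0.6, "acc_norm": 0.8},
}
print(round(mean_metric(sample), 2))              # 0.55
print(round(mean_metric(sample, "acc_norm"), 2))  # 0.7
```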
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ChristophSchuhmann/Chess-Selfplay2 | ---
license: apache-2.0
---
|
thehamkercat/telegram-spam-ham | ---
license: wtfpl
---
|
Hack90/ncbi_genbank_part_77 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: sequence
dtype: string
- name: name
dtype: string
- name: description
dtype: string
- name: features
dtype: int64
- name: seq_length
dtype: int64
splits:
- name: train
num_bytes: 29897565069
num_examples: 1177983
download_size: 13158660518
dataset_size: 29897565069
---
# Dataset Card for "ncbi_genbank_part_77"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Helsinki-NLP/tatoeba_mt | ---
annotations_creators:
- no-annotation
language_creators:
- crowdsourced
language:
- af
- ar
- az
- be
- bg
- bn
- br
- bs
- ca
- ch
- cs
- cv
- cy
- da
- de
- el
- en
- eo
- es
- et
- eu
- fa
- fi
- fo
- fr
- fy
- ga
- gd
- gl
- gn
- he
- hi
- hr
- hu
- hy
- ia
- id
- ie
- io
- is
- it
- ja
- jv
- ka
- kk
- km
- ko
- ku
- kw
- la
- lb
- lt
- lv
- mi
- mk
- ml
- mn
- mr
- ms
- mt
- my
- nb
- nl
- nn
- 'no'
- oc
- pl
- pt
- qu
- rn
- ro
- ru
- sh
- sl
- sq
- sr
- sv
- sw
- ta
- te
- th
- tk
- tl
- tr
- tt
- ug
- uk
- ur
- uz
- vi
- vo
- yi
- zh
license:
- cc-by-2.0
multilinguality:
- translation
pretty_name: The Tatoeba Translation Challenge
size_categories:
- unknown
source_datasets:
- original
task_categories:
- conditional-text-generation
task_ids:
- machine-translation
---
# Dataset Card for [Dataset Name]
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/Helsinki-NLP/Tatoeba-Challenge/
- **Repository:** https://github.com/Helsinki-NLP/Tatoeba-Challenge/
- **Paper:** [The Tatoeba Translation Challenge – Realistic Data Sets for Low Resource and Multilingual MT](https://aclanthology.org/2020.wmt-1.139/)
- **Leaderboard:**
- **Point of Contact:** [Jörg Tiedemann](mailto:jorg.tiedemann@helsinki.fi)
### Dataset Summary
The Tatoeba Translation Challenge is a multilingual data set of machine translation benchmarks derived from user-contributed translations collected by [Tatoeba.org](https://tatoeba.org/) and provided as parallel corpus from [OPUS](https://opus.nlpl.eu/). This dataset includes test and development data sorted by language pair. It includes test sets for hundreds of language pairs and is continuously updated. Please check the version number tag to refer to the release that you are using.
### Supported Tasks and Leaderboards
The translation task is described in detail in the [Tatoeba-Challenge repository](https://github.com/Helsinki-NLP/Tatoeba-Challenge) and covers various sub-tasks with different data coverage and resources. [Training data](https://github.com/Helsinki-NLP/Tatoeba-Challenge/blob/master/data/README.md) is also available from the same repository and [results](https://github.com/Helsinki-NLP/Tatoeba-Challenge/blob/master/results/tatoeba-results-all.md) are published and collected as well. [Models](https://github.com/Helsinki-NLP/Tatoeba-Challenge/blob/master/results/tatoeba-models-all.md) are also released for public use and are also partially available from the [huggingface model hub](https://huggingface.co/Helsinki-NLP).
### Languages
The data set covers hundreds of languages and language pairs and is organized by ISO-639-3 languages. The current release covers the following languages: Afrikaans, Arabic, Azerbaijani, Belarusian, Bulgarian, Bengali, Breton, Bosnian, Catalan, Chamorro, Czech, Chuvash, Welsh, Danish, German, Modern Greek, English, Esperanto, Spanish, Estonian, Basque, Persian, Finnish, Faroese, French, Western Frisian, Irish, Scottish Gaelic, Galician, Guarani, Hebrew, Hindi, Croatian, Hungarian, Armenian, Interlingua, Indonesian, Interlingue, Ido, Icelandic, Italian, Japanese, Javanese, Georgian, Kazakh, Khmer, Korean, Kurdish, Cornish, Latin, Luxembourgish, Lithuanian, Latvian, Maori, Macedonian, Malayalam, Mongolian, Marathi, Malay, Maltese, Burmese, Norwegian Bokmål, Dutch, Norwegian Nynorsk, Norwegian, Occitan, Polish, Portuguese, Quechua, Rundi, Romanian, Russian, Serbo-Croatian, Slovenian, Albanian, Serbian, Swedish, Swahili, Tamil, Telugu, Thai, Turkmen, Tagalog, Turkish, Tatar, Uighur, Ukrainian, Urdu, Uzbek, Vietnamese, Volapük, Yiddish, Chinese
## Dataset Structure
### Data Instances
Data instances are given as translation units in TAB-separated files with four columns: source and target language ISO-639-3 codes, source language text and target language text. Note that we do not imply a translation direction and consider the data set to be symmetric and to be used as a test set in both directions. Language-pair-specific subsets are only provided under the label of one direction using sorted ISO-639-3 language IDs.
Some subsets contain several sub-languages or language variants. They may refer to macro-languages such as the Serbo-Croatian languages that are covered by the ISO code `hbs`. Language variants may also include different writing systems, in which case ISO 15924 script codes are attached to the language codes. Here are a few examples from the English to Serbo-Croatian test set, including examples for Bosnian, Croatian and Serbian in Cyrillic and in Latin scripts:
```
eng bos_Latn Children are the flowers of our lives. Djeca su cvijeće našeg života.
eng hrv A bird was flying high up in the sky. Ptica je visoko letjela nebom.
eng srp_Cyrl A bird in the hand is worth two in the bush. Боље врабац у руци, него голуб на грани.
eng srp_Latn Canada is the motherland of ice hockey. Kanada je zemlja-majka hokeja na ledu.
```
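The four-column format above is easy to read with plain Python; the sketch below also splits labels such as `srp_Cyrl` into their ISO-639-3 language code and optional ISO 15924 script suffix:

```python
# Split a label like "srp_Cyrl" into (language, script); plain labels
# such as "hrv" yield (language, None).
def parse_label(label):
    lang, _, script = label.partition("_")
    return lang, script or None

# Read four-column TSV lines: src lang ID, tgt lang ID, src text, tgt text.
def read_pairs(lines):
    for line in lines:
        src_lang, tgt_lang, src_text, tgt_text = line.rstrip("\n").split("\t")
        yield src_lang, tgt_lang, src_text, tgt_text

rows = ["eng\tsrp_Cyrl\tA bird in the hand is worth two in the bush.\t"
        "Боље врабац у руци, него голуб на грани."]
for src_lang, tgt_lang, src, tgt in read_pairs(rows):
    print(parse_label(tgt_lang))  # ('srp', 'Cyrl')
```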
There are also data sets with sentence pairs in the same language. In most cases, those are variants with minor spelling differences but they also include rephrased sentences. Here are a few examples from the English test set:
```
eng eng All of us got into the car. We all got in the car.
eng eng All of us hope that doesn't happen. All of us hope that that doesn't happen.
eng eng All the seats are booked. The seats are all sold out.
```
### Data Splits
Test and development data sets are disjoint with respect to sentence pairs but may include overlaps in individual source or target language sentences. Development data should not be used in training directly. The goal of the data splits is to create test sets of reasonable size with a large language coverage. Test sets include at most 10,000 instances. Development data do not exist for all language pairs.
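The pair-level disjointness described above can be checked with a simple set intersection (a sketch; `test_pairs` and `dev_pairs` are hypothetical lists of (src lang, tgt lang, src text, tgt text) tuples):

```python
# Test and dev must not share whole sentence *pairs*; individual
# sentences may still repeat on either side.
def pair_overlap(test_pairs, dev_pairs):
    return set(test_pairs) & set(dev_pairs)

test = [("eng", "deu", "Hello.", "Hallo.")]
dev = [("eng", "deu", "Hello.", "Hallo!")]  # same source, different target
print(pair_overlap(test, dev))  # set()
```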
To be comparable with other results, models should use the training data distributed from the [Tatoeba MT Challenge Repository](https://github.com/Helsinki-NLP/Tatoeba-Challenge/) including monolingual data sets also listed there.
## Dataset Creation
### Curation Rationale
The Tatoeba MT data set will be updated continuously and the data preparation procedures are also public and released on [github](https://github.com/Helsinki-NLP/Tatoeba-Challenge/). High language coverage is the main goal of the project and data sets are prepared to be consistent and systematic with standardized language labels and distribution formats.
### Source Data
#### Initial Data Collection and Normalization
The Tatoeba data sets are collected from user-contributed translations submitted to [Tatoeba.org](https://tatoeba.org/) and compiled into a multi-parallel corpus in [OPUS](https://opus.nlpl.eu/Tatoeba.php). The test and development sets are incrementally updated with new releases of the Tatoeba data collection at OPUS. New releases extend the existing data sets. Test sets should not overlap with any of the released development data sets.
#### Who are the source language producers?
The data sets come from [Tatoeba.org](https://tatoeba.org/), which provides a large database of sentences and their translations into a wide variety of languages. Its content is constantly growing as a result of voluntary contributions of thousands of users.
The original project was founded by Trang Ho in 2006, hosted on Sourceforge under the codename of multilangdict.
### Annotations
#### Annotation process
Sentences are translated by volunteers and the Tatoeba database also provides additional metadata about each record including user ratings etc. However, the metadata is currently not used in any way for the compilation of the MT benchmark. Language skills of contributors naturally vary quite a bit and not all translations are done by native speakers of the target language. More information about the contributions can be found at [Tatoeba.org](https://tatoeba.org/).
#### Who are the annotators?
### Personal and Sensitive Information
For information about handling personal and sensitive information we refer to the [original provider](https://tatoeba.org/) of the data. This data set has not been processed in any way to detect or remove potentially sensitive or personal information.
## Considerations for Using the Data
### Social Impact of Dataset
The language coverage is high and with that it represents a highly valuable resource for machine translation development, especially for lesser-resourced languages and language pairs. The constantly growing database also represents a dynamic resource and its value will grow further.
### Discussion of Biases
The original source lives from its contributors, and their interests and backgrounds lead to certain subjective and cultural biases. Language coverage and translation quality are also biased by the skills of the contributors.
### Other Known Limitations
The sentences are typically quite short and, therefore, rather easy to translate. For high-resource languages, this leads to results that will be less useful than more challenging benchmarks. For lesser-resourced language pairs, the limited complexity of the examples actually helps to measure progress even in very challenging setups.
## Additional Information
### Dataset Curators
The data set is curated by the University of Helsinki and its [language technology research group](https://blogs.helsinki.fi/language-technology/). Data and tools used for creating and using the resource are [open source](https://github.com/Helsinki-NLP/Tatoeba-Challenge/) and will be maintained as part of the [OPUS ecosystem](https://opus.nlpl.eu/) for parallel data and machine translation research.
### Licensing Information
The data sets are distributed under the same licence agreement as the original Tatoeba database using a
[CC-BY 2.0 license](https://creativecommons.org/licenses/by/2.0/fr/). More information about the terms of use of the original data sets is listed [here](https://tatoeba.org/eng/terms_of_use).
### Citation Information
If you use the data sets then, please, cite the following paper: [The Tatoeba Translation Challenge – Realistic Data Sets for Low Resource and Multilingual MT](https://aclanthology.org/2020.wmt-1.139/)
```
@inproceedings{tiedemann-2020-tatoeba,
title = "The Tatoeba Translation Challenge {--} Realistic Data Sets for Low Resource and Multilingual {MT}",
author = {Tiedemann, J{\"o}rg},
booktitle = "Proceedings of the Fifth Conference on Machine Translation",
month = nov,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2020.wmt-1.139",
pages = "1174--1182",
}
```
### Contributions
Thanks to [@jorgtied](https://github.com/jorgtied) and [@Helsinki-NLP](https://github.com/Helsinki-NLP) for adding this dataset.
Thanks also to [CSC Finland](https://www.csc.fi/en/solutions-for-research) for providing computational resources and storage space for the work on OPUS and other MT projects.
|
one-sec-cv12/chunk_246 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 15612986592.75
num_examples: 162554
download_size: 14032898971
dataset_size: 15612986592.75
---
# Dataset Card for "chunk_246"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Erynan/gpt_util_10 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response_a
dtype: string
- name: response_b
dtype: string
- name: more_reasonable
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 3173
num_examples: 10
download_size: 6151
dataset_size: 3173
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hugginglearners/russia-ukraine-conflict-articles | ---
license:
- cc-by-nc-sa-4.0
kaggle_id: hskhawaja/russia-ukraine-conflict
---
# Dataset Card for Russia Ukraine Conflict
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://kaggle.com/datasets/hskhawaja/russia-ukraine-conflict
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
### Context
On 24 February 2022, Russia invaded Ukraine in a major escalation of the Russo-Ukrainian War that began in 2014. The invasion caused Europe's largest refugee crisis since World War II, with more than 6.3 million Ukrainians fleeing the country and a third of the population displaced (*Source: Wikipedia*).
### Content
This dataset is a collection of 407 news articles from the NYT and the Guardian related to the ongoing conflict between Russia and Ukraine. The publishing dates of the articles range from Feb 1st, 2022 to Jul 31st, 2022.
### What you can do
Here are some ideas to explore:
- Discourse analysis of the Russia-Ukraine conflict (how has the war evolved over the months?)
- Identify the most talked-about issues (refugees, food, weapons, fuel, etc.)
- Extract the sentiment of articles towards both Russia and Ukraine
- Which world leaders have tried to become mediators?
- Number of countries supporting Russia and Ukraine, respectively
- Map how the NATO alliance has been affected by the war
I am looking forward to seeing your work and ideas, and will keep adding more ideas to explore.
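The topic-identification idea above can be sketched with a simple keyword scan over the article text. This is only a minimal illustration: the `text` column name and the sample rows below are hypothetical stand-ins for the actual schema of the 407-article CSV.

```python
import pandas as pd

# Hypothetical sample standing in for the real article data; the "text"
# column name is an assumption about the dataset's schema.
articles = pd.DataFrame({
    "date": ["2022-02-24", "2022-03-15"],
    "text": [
        "Russia invaded Ukraine, and weapons shipments were debated.",
        "Refugees crossed the border as fuel prices rose.",
    ],
})

topics = ["refugees", "food", "weapons", "fuel"]
# Count how many articles mention each topic (case-insensitive substring match).
counts = {t: int(articles["text"].str.contains(t, case=False).sum()) for t in topics}
print(counts)
```

A keyword count like this is a crude baseline; a topic model or an embedding-based classifier would give a finer-grained picture of how coverage shifted between February and July 2022.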
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
This dataset was shared by [@hskhawaja](https://kaggle.com/hskhawaja)
### Licensing Information
The license for this dataset is cc-by-nc-sa-4.0
### Citation Information
```bibtex
[More Information Needed]
```
### Contributions
[More Information Needed] |
decomedeiros/ze_vaqueiro | ---
license: openrail
---
|
whatisslove11/80_ms | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: label
dtype:
class_label:
names:
'0': normal_speech
'1': whisper
'2': music
'3': scream
splits:
- name: train
num_bytes: 5918544229.648
num_examples: 417736
download_size: 5665644557
dataset_size: 5918544229.648
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zolak/twitter_dataset_80_1713094190 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 3176857
num_examples: 8066
download_size: 1610525
dataset_size: 3176857
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
justram/AToMiC-Texts-v0.2-medium | ---
dataset_info:
features:
- name: text_id
dtype: string
- name: page_url
dtype: string
- name: page_title
dtype: string
- name: section_title
dtype: string
- name: context_page_description
dtype: string
- name: context_section_description
dtype: string
- name: media
sequence: string
- name: hierachy
sequence: string
- name: category
sequence: string
- name: source_id
dtype: string
splits:
- name: train
num_bytes: 5404754455.050775
num_examples: 3002458
- name: validation
num_bytes: 30913287.798392836
num_examples: 17173
- name: test
num_bytes: 17772485.321931664
num_examples: 9873
download_size: 2719090777
dataset_size: 5502126001.291424
---
# Dataset Card for "AToMiC-Texts-v0.2-medium"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sanaeai/tead1 | ---
dataset_info:
features:
- name: tweet
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 1098060
num_examples: 12558
download_size: 603080
dataset_size: 1098060
---
# Dataset Card for "tead1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Rosenberg/zhmsra | ---
license: mit
---
|
onghh0123/your_dataset_name | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: Section
dtype: string
- name: Details
dtype: string
splits:
- name: train
num_bytes: 4196007
num_examples: 1011
download_size: 2248694
dataset_size: 4196007
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kheopss/humorous_tone_dataset | ---
dataset_info:
features:
- name: assistant response
dtype: string
- name: response
dtype: string
- name: system
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 630048
num_examples: 114
download_size: 360639
dataset_size: 630048
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
izhx/sts17-crosslingual-extend | ---
size_categories:
- n<1K
---
This dataset is derived from [`mteb/sts17-crosslingual-sts`](https://huggingface.co/datasets/mteb/sts17-crosslingual-sts).
We translated the `en-en` subset to `zh-zh` and `id-id` using ChatGPT.
|
open-llm-leaderboard/details_v1olet__v1olet_merged_dpo_7B | ---
pretty_name: Evaluation run of v1olet/v1olet_merged_dpo_7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [v1olet/v1olet_merged_dpo_7B](https://huggingface.co/v1olet/v1olet_merged_dpo_7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_v1olet__v1olet_merged_dpo_7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-12T12:46:34.899299](https://huggingface.co/datasets/open-llm-leaderboard/details_v1olet__v1olet_merged_dpo_7B/blob/main/results_2023-12-12T12-46-34.899299.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6440679427811095,\n\
\ \"acc_stderr\": 0.03250447611787324,\n \"acc_norm\": 0.646485224747175,\n\
\ \"acc_norm_stderr\": 0.03316159386352293,\n \"mc1\": 0.5030599755201959,\n\
\ \"mc1_stderr\": 0.017503173260960625,\n \"mc2\": 0.6336908170998187,\n\
\ \"mc2_stderr\": 0.01575011120254939\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6766211604095563,\n \"acc_stderr\": 0.013669421630012136,\n\
\ \"acc_norm\": 0.7133105802047781,\n \"acc_norm_stderr\": 0.013214986329274783\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7018522206731727,\n\
\ \"acc_stderr\": 0.004565098421085227,\n \"acc_norm\": 0.8734315873332006,\n\
\ \"acc_norm_stderr\": 0.003318093579702922\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800886,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800886\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.032400380867927465,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.032400380867927465\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7548387096774194,\n \"acc_stderr\": 0.024472243840895518,\n \"\
acc_norm\": 0.7548387096774194,\n \"acc_norm_stderr\": 0.024472243840895518\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.47783251231527096,\n \"acc_stderr\": 0.035145285621750094,\n \"\
acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.035145285621750094\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479047,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479047\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121427,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121427\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131147,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131147\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8220183486238533,\n \"acc_stderr\": 0.0163994363666129,\n \"acc_norm\"\
: 0.8220183486238533,\n \"acc_norm_stderr\": 0.0163994363666129\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n\
\ \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n\
\ \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n\
\ \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n\
\ \"acc_stderr\": 0.03050028317654584,\n \"acc_norm\": 0.7085201793721974,\n\
\ \"acc_norm_stderr\": 0.03050028317654584\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728744,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728744\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990945,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990945\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n\
\ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.80970625798212,\n\
\ \"acc_stderr\": 0.014036945850381387,\n \"acc_norm\": 0.80970625798212,\n\
\ \"acc_norm_stderr\": 0.014036945850381387\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.02447699407624734,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.02447699407624734\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42793296089385474,\n\
\ \"acc_stderr\": 0.01654788799741611,\n \"acc_norm\": 0.42793296089385474,\n\
\ \"acc_norm_stderr\": 0.01654788799741611\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765137,\n\
\ \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765137\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4530638852672751,\n\
\ \"acc_stderr\": 0.012713845972358986,\n \"acc_norm\": 0.4530638852672751,\n\
\ \"acc_norm_stderr\": 0.012713845972358986\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.028959755196824873,\n\
\ \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.028959755196824873\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6519607843137255,\n \"acc_stderr\": 0.019270998708223977,\n \
\ \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.019270998708223977\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128445,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128445\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.02619392354445412,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.02619392354445412\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5030599755201959,\n\
\ \"mc1_stderr\": 0.017503173260960625,\n \"mc2\": 0.6336908170998187,\n\
\ \"mc2_stderr\": 0.01575011120254939\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8200473559589582,\n \"acc_stderr\": 0.01079646868806868\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.533737680060652,\n \
\ \"acc_stderr\": 0.01374109641222676\n }\n}\n```"
repo_url: https://huggingface.co/v1olet/v1olet_merged_dpo_7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|arc:challenge|25_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|gsm8k|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hellaswag|10_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-12T12-46-34.899299.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-12T12-46-34.899299.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- '**/details_harness|winogrande|5_2023-12-12T12-46-34.899299.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-12T12-46-34.899299.parquet'
- config_name: results
data_files:
- split: 2023_12_12T12_46_34.899299
path:
- results_2023-12-12T12-46-34.899299.parquet
- split: latest
path:
- results_2023-12-12T12-46-34.899299.parquet
---
# Dataset Card for Evaluation run of v1olet/v1olet_merged_dpo_7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [v1olet/v1olet_merged_dpo_7B](https://huggingface.co/v1olet/v1olet_merged_dpo_7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_v1olet__v1olet_merged_dpo_7B",
"harness_winogrande_5",
	split="latest")
```
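Each configuration name follows a fixed pattern derived from the harness task name: `harness_`, then the task with non-alphanumeric characters replaced by underscores, then the few-shot count. A small helper (hypothetical, not part of the leaderboard tooling) can build these names, for instance to load several subtasks in a loop:

```python
import re

def config_name(task: str, n_shot: int) -> str:
    """Map a harness task name such as 'hendrycksTest-world_religions'
    to its dataset config name, e.g. 'harness_hendrycksTest_world_religions_5'."""
    # Replace any run of non-alphanumeric characters ('-', ':', ...) with '_'.
    return f"harness_{re.sub(r'[^0-9a-zA-Z]+', '_', task)}_{n_shot}"

print(config_name("hendrycksTest-world_religions", 5))
# -> harness_hendrycksTest_world_religions_5
print(config_name("truthfulqa:mc", 0))
# -> harness_truthfulqa_mc_0
```

These names match the `config_name` entries listed in the card metadata above.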
## Latest results

These are the [latest results from run 2023-12-12T12:46:34.899299](https://huggingface.co/datasets/open-llm-leaderboard/details_v1olet__v1olet_merged_dpo_7B/blob/main/results_2023-12-12T12-46-34.899299.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6440679427811095,
"acc_stderr": 0.03250447611787324,
"acc_norm": 0.646485224747175,
"acc_norm_stderr": 0.03316159386352293,
"mc1": 0.5030599755201959,
"mc1_stderr": 0.017503173260960625,
"mc2": 0.6336908170998187,
"mc2_stderr": 0.01575011120254939
},
"harness|arc:challenge|25": {
"acc": 0.6766211604095563,
"acc_stderr": 0.013669421630012136,
"acc_norm": 0.7133105802047781,
"acc_norm_stderr": 0.013214986329274783
},
"harness|hellaswag|10": {
"acc": 0.7018522206731727,
"acc_stderr": 0.004565098421085227,
"acc_norm": 0.8734315873332006,
"acc_norm_stderr": 0.003318093579702922
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800886,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800886
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440678,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440678
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.024472243840895518,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.024472243840895518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.035145285621750094,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.035145285621750094
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479047,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479047
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121427,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121427
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131147,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131147
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.0163994363666129,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.0163994363666129
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.03050028317654584,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.03050028317654584
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728744,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728744
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990945,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990945
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.047184714852195886,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.047184714852195886
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179333,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.80970625798212,
"acc_stderr": 0.014036945850381387,
"acc_norm": 0.80970625798212,
"acc_norm_stderr": 0.014036945850381387
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.02447699407624734,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.02447699407624734
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42793296089385474,
"acc_stderr": 0.01654788799741611,
"acc_norm": 0.42793296089385474,
"acc_norm_stderr": 0.01654788799741611
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.025089478523765137,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.025089478523765137
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4530638852672751,
"acc_stderr": 0.012713845972358986,
"acc_norm": 0.4530638852672751,
"acc_norm_stderr": 0.012713845972358986
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.028959755196824873,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.028959755196824873
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.019270998708223977,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.019270998708223977
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128445,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128445
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.02619392354445412,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.02619392354445412
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5030599755201959,
"mc1_stderr": 0.017503173260960625,
"mc2": 0.6336908170998187,
"mc2_stderr": 0.01575011120254939
},
"harness|winogrande|5": {
"acc": 0.8200473559589582,
"acc_stderr": 0.01079646868806868
},
"harness|gsm8k|5": {
"acc": 0.533737680060652,
"acc_stderr": 0.01374109641222676
}
}
```
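Once loaded, the per-task scores above can be post-processed locally, for instance to average accuracy across the MMLU (`hendrycksTest`) subtasks. A minimal sketch (the `results` dict below reproduces just a few entries from the JSON above; real keys follow the `harness|<task>|<n_shot>` pattern):

```python
# Average accuracy over the hendrycksTest (MMLU) subtasks of a results dict.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.38},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6074074074074074},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6842105263157895},
}

# Keep only the MMLU subtasks and collect their accuracies.
mmlu_scores = [v["acc"] for k, v in results.items() if "hendrycksTest" in k]
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)
print(f"MMLU subtasks: {len(mmlu_scores)}, mean acc: {mmlu_avg:.4f}")
```

The same filtering works on the full dict in `results_2023-12-12T12-46-34.899299.json`, where all 57 `hendrycksTest` entries are present.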
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
distilabel-internal-testing/SystemChat-1.1-tiny | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 21166.74515235457
num_examples: 10
download_size: 15489
dataset_size: 21166.74515235457
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-eval-adversarial_qa-adversarialQA-cd62e4-67882145606 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- adversarial_qa
eval_info:
task: extractive_question_answering
model: Akihiro2/bert-finetuned-squad
metrics: []
dataset_name: adversarial_qa
dataset_config: adversarialQA
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: Akihiro2/bert-finetuned-squad
* Dataset: adversarial_qa
* Config: adversarialQA
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
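The `col_mapping` in the metadata above maps evaluator-expected column names to (possibly nested) dataset fields via dotted paths such as `answers.text`. A rough sketch of how such a mapping can be resolved against a SQuAD-style record (the record contents are invented for illustration):

```python
def resolve(record, dotted_path):
    """Follow a dotted path like 'answers.text' through nested dicts."""
    value = record
    for key in dotted_path.split("."):
        value = value[key]
    return value

# The mapping from the eval_info block above.
col_mapping = {
    "context": "context",
    "question": "question",
    "answers-text": "answers.text",
    "answers-answer_start": "answers.answer_start",
}

# A hypothetical adversarial_qa-style record (values invented for illustration).
record = {
    "context": "The Amazon rainforest spans nine countries.",
    "question": "How many countries does the Amazon span?",
    "answers": {"text": ["nine"], "answer_start": [28]},
}

mapped = {col: resolve(record, path) for col, path in col_mapping.items()}
print(mapped["answers-text"])  # -> ['nine']
```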
## Contributions
Thanks to [@zhouzj](https://huggingface.co/zhouzj) for evaluating this model. |
azhx/counterfact | ---
dataset_info:
features:
- name: case_id
dtype: int64
- name: pararel_idx
dtype: int64
- name: requested_rewrite
struct:
- name: prompt
dtype: string
- name: relation_id
dtype: string
- name: subject
dtype: string
- name: target_new
struct:
- name: id
dtype: string
- name: str
dtype: string
- name: target_true
struct:
- name: id
dtype: string
- name: str
dtype: string
- name: paraphrase_prompts
sequence: string
- name: neighborhood_prompts
sequence: string
- name: attribute_prompts
sequence: string
- name: generation_prompts
sequence: string
splits:
- name: train
num_bytes: 29388723
num_examples: 19728
- name: test
num_bytes: 3268668
num_examples: 2191
download_size: 12387190
dataset_size: 32657391
---
# Dataset Card for "counterfact"
Dataset from [ROME](https://rome.baulab.info/) by Meng et al.
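Each record's `requested_rewrite` pairs a subject with a prompt template and a true/new target, per the schema above. A minimal sketch of turning one record into an edit prompt (the field contents here follow the ROME paper's running example and should be treated as illustrative, not as an actual row of this dataset):

```python
# A CounterFact-style record, matching the requested_rewrite schema above.
record = {
    "requested_rewrite": {
        "prompt": "The mother tongue of {} is",
        "subject": "Danielle Darrieux",
        "target_true": {"str": "French"},
        "target_new": {"str": "English"},
    }
}

rw = record["requested_rewrite"]
# The prompt template carries a {} placeholder for the subject.
edit_prompt = rw["prompt"].format(rw["subject"])
print(edit_prompt)  # -> The mother tongue of Danielle Darrieux is
print(rw["target_true"]["str"], "->", rw["target_new"]["str"])
```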
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kgr123/quality_counter_4608_4_uniq | ---
dataset_info:
features:
- name: context
dtype: string
- name: word
dtype: string
- name: claim
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 553222362
num_examples: 20000
- name: validation
num_bytes: 222944090
num_examples: 8000
- name: test
num_bytes: 56238328
num_examples: 2300
download_size: 26388335
dataset_size: 832404780
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_cognitivecomputations__mixtral-instruct-0.1-laser | ---
pretty_name: Evaluation run of cognitivecomputations/mixtral-instruct-0.1-laser
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cognitivecomputations/mixtral-instruct-0.1-laser](https://huggingface.co/cognitivecomputations/mixtral-instruct-0.1-laser)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cognitivecomputations__mixtral-instruct-0.1-laser\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-22T22:00:06.190625](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__mixtral-instruct-0.1-laser/blob/main/results_2024-02-22T22-00-06.190625.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7091955694047882,\n\
\ \"acc_stderr\": 0.03040413149190394,\n \"acc_norm\": 0.713318282242631,\n\
\ \"acc_norm_stderr\": 0.030988909195049447,\n \"mc1\": 0.5067319461444308,\n\
\ \"mc1_stderr\": 0.017501914492655393,\n \"mc2\": 0.6582573067262687,\n\
\ \"mc2_stderr\": 0.015095610656901154\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6749146757679181,\n \"acc_stderr\": 0.013688147309729117,\n\
\ \"acc_norm\": 0.7047781569965871,\n \"acc_norm_stderr\": 0.01332975029338232\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.682832105158335,\n\
\ \"acc_stderr\": 0.0046442232947277225,\n \"acc_norm\": 0.8728340967934675,\n\
\ \"acc_norm_stderr\": 0.003324778429495362\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.04094376269996793,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.04094376269996793\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03317672787533157,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03317672787533157\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\
\ \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n \
\ \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7811320754716982,\n \"acc_stderr\": 0.025447863825108604,\n\
\ \"acc_norm\": 0.7811320754716982,\n \"acc_norm_stderr\": 0.025447863825108604\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8541666666666666,\n\
\ \"acc_stderr\": 0.029514245964291766,\n \"acc_norm\": 0.8541666666666666,\n\
\ \"acc_norm_stderr\": 0.029514245964291766\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7398843930635838,\n\
\ \"acc_stderr\": 0.033450369167889904,\n \"acc_norm\": 0.7398843930635838,\n\
\ \"acc_norm_stderr\": 0.033450369167889904\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.04959859966384181,\n\
\ \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.04959859966384181\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n\
\ \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6468085106382979,\n \"acc_stderr\": 0.031245325202761926,\n\
\ \"acc_norm\": 0.6468085106382979,\n \"acc_norm_stderr\": 0.031245325202761926\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6052631578947368,\n\
\ \"acc_stderr\": 0.04598188057816542,\n \"acc_norm\": 0.6052631578947368,\n\
\ \"acc_norm_stderr\": 0.04598188057816542\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6551724137931034,\n \"acc_stderr\": 0.03960933549451208,\n\
\ \"acc_norm\": 0.6551724137931034,\n \"acc_norm_stderr\": 0.03960933549451208\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.48677248677248675,\n \"acc_stderr\": 0.025742297289575142,\n \"\
acc_norm\": 0.48677248677248675,\n \"acc_norm_stderr\": 0.025742297289575142\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8258064516129032,\n\
\ \"acc_stderr\": 0.021576248184514583,\n \"acc_norm\": 0.8258064516129032,\n\
\ \"acc_norm_stderr\": 0.021576248184514583\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.03465304488406796,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.03465304488406796\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\"\
: 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.030874145136562094,\n\
\ \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.030874145136562094\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"\
acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9481865284974094,\n \"acc_stderr\": 0.01599622932024412,\n\
\ \"acc_norm\": 0.9481865284974094,\n \"acc_norm_stderr\": 0.01599622932024412\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6974358974358974,\n \"acc_stderr\": 0.023290888053772725,\n\
\ \"acc_norm\": 0.6974358974358974,\n \"acc_norm_stderr\": 0.023290888053772725\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857406,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857406\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8067226890756303,\n \"acc_stderr\": 0.025649470265889183,\n\
\ \"acc_norm\": 0.8067226890756303,\n \"acc_norm_stderr\": 0.025649470265889183\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"\
acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8752293577981651,\n \"acc_stderr\": 0.014168298359156327,\n \"\
acc_norm\": 0.8752293577981651,\n \"acc_norm_stderr\": 0.014168298359156327\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6018518518518519,\n \"acc_stderr\": 0.03338473403207401,\n \"\
acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.03338473403207401\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8627450980392157,\n \"acc_stderr\": 0.024152225962801588,\n \"\
acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.024152225962801588\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8523206751054853,\n \"acc_stderr\": 0.023094329582595698,\n \
\ \"acc_norm\": 0.8523206751054853,\n \"acc_norm_stderr\": 0.023094329582595698\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7443946188340808,\n\
\ \"acc_stderr\": 0.029275891003969923,\n \"acc_norm\": 0.7443946188340808,\n\
\ \"acc_norm_stderr\": 0.029275891003969923\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752596,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752596\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.03008309871603521,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.03008309871603521\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5714285714285714,\n\
\ \"acc_stderr\": 0.04697113923010213,\n \"acc_norm\": 0.5714285714285714,\n\
\ \"acc_norm_stderr\": 0.04697113923010213\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.034926064766237906,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.034926064766237906\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9188034188034188,\n\
\ \"acc_stderr\": 0.017893784904018533,\n \"acc_norm\": 0.9188034188034188,\n\
\ \"acc_norm_stderr\": 0.017893784904018533\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8748403575989783,\n\
\ \"acc_stderr\": 0.011832954239305738,\n \"acc_norm\": 0.8748403575989783,\n\
\ \"acc_norm_stderr\": 0.011832954239305738\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7745664739884393,\n \"acc_stderr\": 0.022497230190967554,\n\
\ \"acc_norm\": 0.7745664739884393,\n \"acc_norm_stderr\": 0.022497230190967554\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43798882681564244,\n\
\ \"acc_stderr\": 0.01659339422756484,\n \"acc_norm\": 0.43798882681564244,\n\
\ \"acc_norm_stderr\": 0.01659339422756484\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8169934640522876,\n \"acc_stderr\": 0.022140767512880948,\n\
\ \"acc_norm\": 0.8169934640522876,\n \"acc_norm_stderr\": 0.022140767512880948\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7684887459807074,\n\
\ \"acc_stderr\": 0.023956532766639133,\n \"acc_norm\": 0.7684887459807074,\n\
\ \"acc_norm_stderr\": 0.023956532766639133\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8240740740740741,\n \"acc_stderr\": 0.021185893615225153,\n\
\ \"acc_norm\": 0.8240740740740741,\n \"acc_norm_stderr\": 0.021185893615225153\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5354609929078015,\n \"acc_stderr\": 0.029752389657427054,\n \
\ \"acc_norm\": 0.5354609929078015,\n \"acc_norm_stderr\": 0.029752389657427054\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.530638852672751,\n\
\ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.530638852672751,\n\
\ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7867647058823529,\n \"acc_stderr\": 0.024880971512294254,\n\
\ \"acc_norm\": 0.7867647058823529,\n \"acc_norm_stderr\": 0.024880971512294254\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7843137254901961,\n \"acc_stderr\": 0.016639319350313264,\n \
\ \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.016639319350313264\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n\
\ \"acc_stderr\": 0.04172343038705383,\n \"acc_norm\": 0.7454545454545455,\n\
\ \"acc_norm_stderr\": 0.04172343038705383\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.763265306122449,\n \"acc_stderr\": 0.027212835884073153,\n\
\ \"acc_norm\": 0.763265306122449,\n \"acc_norm_stderr\": 0.027212835884073153\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n\
\ \"acc_stderr\": 0.023335401790166323,\n \"acc_norm\": 0.8756218905472637,\n\
\ \"acc_norm_stderr\": 0.023335401790166323\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n\
\ \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5067319461444308,\n\
\ \"mc1_stderr\": 0.017501914492655393,\n \"mc2\": 0.6582573067262687,\n\
\ \"mc2_stderr\": 0.015095610656901154\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8082083662194159,\n \"acc_stderr\": 0.011065209664659527\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5868081880212282,\n \
\ \"acc_stderr\": 0.01356332695198437\n }\n}\n```"
repo_url: https://huggingface.co/cognitivecomputations/mixtral-instruct-0.1-laser
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|arc:challenge|25_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|gsm8k|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hellaswag|10_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T22-00-06.190625.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-22T22-00-06.190625.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- '**/details_harness|winogrande|5_2024-02-22T22-00-06.190625.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-22T22-00-06.190625.parquet'
- config_name: results
data_files:
- split: 2024_02_22T22_00_06.190625
path:
- results_2024-02-22T22-00-06.190625.parquet
- split: latest
path:
- results_2024-02-22T22-00-06.190625.parquet
---
# Dataset Card for Evaluation run of cognitivecomputations/mixtral-instruct-0.1-laser
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cognitivecomputations/mixtral-instruct-0.1-laser](https://huggingface.co/cognitivecomputations/mixtral-instruct-0.1-laser) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
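As a rough sketch of the split-naming convention described above, the split name for a run can be derived from its ISO timestamp by replacing `-` and `:` with `_` (this simple substitution is an assumption inferred from the split names listed in this card's configs, e.g. `2024_02_22T22_00_06.190625`):

```python
def timestamp_to_split_name(timestamp: str) -> str:
    """Convert an ISO run timestamp to the split name used in this dataset.

    Split names replace '-' and ':' with '_', so the run
    2024-02-22T22:00:06.190625 maps to split 2024_02_22T22_00_06.190625.
    """
    return timestamp.replace("-", "_").replace(":", "_")

print(timestamp_to_split_name("2024-02-22T22:00:06.190625"))
# → 2024_02_22T22_00_06.190625
```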
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cognitivecomputations__mixtral-instruct-0.1-laser",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-22T22:00:06.190625](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__mixtral-instruct-0.1-laser/blob/main/results_2024-02-22T22-00-06.190625.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7091955694047882,
"acc_stderr": 0.03040413149190394,
"acc_norm": 0.713318282242631,
"acc_norm_stderr": 0.030988909195049447,
"mc1": 0.5067319461444308,
"mc1_stderr": 0.017501914492655393,
"mc2": 0.6582573067262687,
"mc2_stderr": 0.015095610656901154
},
"harness|arc:challenge|25": {
"acc": 0.6749146757679181,
"acc_stderr": 0.013688147309729117,
"acc_norm": 0.7047781569965871,
"acc_norm_stderr": 0.01332975029338232
},
"harness|hellaswag|10": {
"acc": 0.682832105158335,
"acc_stderr": 0.0046442232947277225,
"acc_norm": 0.8728340967934675,
"acc_norm_stderr": 0.003324778429495362
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996793,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996793
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7811320754716982,
"acc_stderr": 0.025447863825108604,
"acc_norm": 0.7811320754716982,
"acc_norm_stderr": 0.025447863825108604
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8541666666666666,
"acc_stderr": 0.029514245964291766,
"acc_norm": 0.8541666666666666,
"acc_norm_stderr": 0.029514245964291766
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.033450369167889904,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.033450369167889904
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.04959859966384181,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.04959859966384181
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6468085106382979,
"acc_stderr": 0.031245325202761926,
"acc_norm": 0.6468085106382979,
"acc_norm_stderr": 0.031245325202761926
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.04598188057816542,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.04598188057816542
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6551724137931034,
"acc_stderr": 0.03960933549451208,
"acc_norm": 0.6551724137931034,
"acc_norm_stderr": 0.03960933549451208
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48677248677248675,
"acc_stderr": 0.025742297289575142,
"acc_norm": 0.48677248677248675,
"acc_norm_stderr": 0.025742297289575142
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8258064516129032,
"acc_stderr": 0.021576248184514583,
"acc_norm": 0.8258064516129032,
"acc_norm_stderr": 0.021576248184514583
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.03465304488406796,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.03465304488406796
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.030874145136562094,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.030874145136562094
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9481865284974094,
"acc_stderr": 0.01599622932024412,
"acc_norm": 0.9481865284974094,
"acc_norm_stderr": 0.01599622932024412
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6974358974358974,
"acc_stderr": 0.023290888053772725,
"acc_norm": 0.6974358974358974,
"acc_norm_stderr": 0.023290888053772725
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857406,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857406
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8067226890756303,
"acc_stderr": 0.025649470265889183,
"acc_norm": 0.8067226890756303,
"acc_norm_stderr": 0.025649470265889183
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8752293577981651,
"acc_stderr": 0.014168298359156327,
"acc_norm": 0.8752293577981651,
"acc_norm_stderr": 0.014168298359156327
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.03338473403207401,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.03338473403207401
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.024152225962801588,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.024152225962801588
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8523206751054853,
"acc_stderr": 0.023094329582595698,
"acc_norm": 0.8523206751054853,
"acc_norm_stderr": 0.023094329582595698
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7443946188340808,
"acc_stderr": 0.029275891003969923,
"acc_norm": 0.7443946188340808,
"acc_norm_stderr": 0.029275891003969923
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752596,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752596
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.03008309871603521,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.03008309871603521
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04697113923010213,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04697113923010213
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.034926064766237906,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.034926064766237906
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9188034188034188,
"acc_stderr": 0.017893784904018533,
"acc_norm": 0.9188034188034188,
"acc_norm_stderr": 0.017893784904018533
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8748403575989783,
"acc_stderr": 0.011832954239305738,
"acc_norm": 0.8748403575989783,
"acc_norm_stderr": 0.011832954239305738
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7745664739884393,
"acc_stderr": 0.022497230190967554,
"acc_norm": 0.7745664739884393,
"acc_norm_stderr": 0.022497230190967554
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43798882681564244,
"acc_stderr": 0.01659339422756484,
"acc_norm": 0.43798882681564244,
"acc_norm_stderr": 0.01659339422756484
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8169934640522876,
"acc_stderr": 0.022140767512880948,
"acc_norm": 0.8169934640522876,
"acc_norm_stderr": 0.022140767512880948
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7684887459807074,
"acc_stderr": 0.023956532766639133,
"acc_norm": 0.7684887459807074,
"acc_norm_stderr": 0.023956532766639133
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.021185893615225153,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.021185893615225153
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5354609929078015,
"acc_stderr": 0.029752389657427054,
"acc_norm": 0.5354609929078015,
"acc_norm_stderr": 0.029752389657427054
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.530638852672751,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.530638852672751,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7867647058823529,
"acc_stderr": 0.024880971512294254,
"acc_norm": 0.7867647058823529,
"acc_norm_stderr": 0.024880971512294254
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.016639319350313264,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.016639319350313264
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.04172343038705383,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.04172343038705383
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.763265306122449,
"acc_stderr": 0.027212835884073153,
"acc_norm": 0.763265306122449,
"acc_norm_stderr": 0.027212835884073153
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166323,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166323
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5067319461444308,
"mc1_stderr": 0.017501914492655393,
"mc2": 0.6582573067262687,
"mc2_stderr": 0.015095610656901154
},
"harness|winogrande|5": {
"acc": 0.8082083662194159,
"acc_stderr": 0.011065209664659527
},
"harness|gsm8k|5": {
"acc": 0.5868081880212282,
"acc_stderr": 0.01356332695198437
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
hbin0701/self_Dpo | ---
license: apache-2.0
---
|
laion/laion5B-watermark-safety-ordered |
FINNUMBER/FINCH_TRAIN_NQA | ---
dataset_info:
features:
- name: task
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 100808255
num_examples: 31269
download_size: 54781024
dataset_size: 100808255
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Doub7e/SDv2-Count-Repeated-4 | ---
dataset_info:
features:
- name: image
dtype: image
- name: prompt
dtype: string
- name: T5_last_hidden_states
sequence:
sequence:
sequence: float32
- name: style
dtype: string
splits:
- name: train
num_bytes: 1476715482.5
num_examples: 1140
download_size: 1286923080
dataset_size: 1476715482.5
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mlfoundations/VisIT-Bench | ---
configs:
- config_name: default
data_files:
- split: test
path: "test/*"
annotations_creators:
- crowdsourced
language:
- en
language_creators:
- found
paperswithcode_id: visit-bench
pretty_name: VisIT-Bench
size_categories:
- 10K<n<100K
source_datasets:
- original
tags:
- vision-and-language
- instruction-following
- human-chatbot-interaction
- image-instruction-pairs
- multi-modal
- task-performance
task_ids: []
extra_gated_prompt: >-
By clicking “Access repository” below, you assert your intention to
exclusively use this resource for research, not for commercial chatbot
development, and agree to abide by the terms detailed in the [VisIT-Bench
license](https://visit-bench.github.io/static/pdfs/visit_bench_license_agreement.txt).
You may also view all instances through the [VisIT-Bench
Explorer](https://huggingface.co/spaces/mlfoundations/visit-bench-explorer-full)
and consult the accompanying [VisIT-Bench Dataset
card](https://huggingface.co/spaces/mlfoundations/visit-bench-explorer-full/blob/main/README.md)
prior to acceptance. If you are unsure about your specific case - do not
hesitate to reach out: visit-bench-support@gmail.com.
license: cc-by-4.0
---
# Dataset Card for VisIT-Bench
- [Dataset Description](#dataset-description)
- [Links](#links)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Data Loading](#data-loading)
- [Licensing Information](#licensing-information)
- [Annotations](#annotations)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Citation Information](#citation-information)
## Dataset Description
VisIT-Bench is a dataset and benchmark for vision-and-language instruction following. The dataset comprises image-instruction pairs and corresponding example outputs, spanning a wide range of tasks, from simple object recognition to complex reasoning. The dataset provides a holistic view of chatbot capabilities.
The results show that state-of-the-art models such as GPT-4 and BLIP2 have a high success rate, but there is room for improvement.
## Links
Auto-evaluation repository: https://github.com/Hritikbansal/visit_bench_sandbox
All images in a zip file (including multi-images): https://visit-instruction-tuning.s3.amazonaws.com/visit_bench_images.zip
A CSV of the single-image dataset: https://visit-instruction-tuning.s3.amazonaws.com/single_image_full_dataset.csv
A CSV of the multi-image dataset: https://visit-instruction-tuning.s3.amazonaws.com/multi_image_full_dataset.csv
Homepage: https://visit-bench.github.io/
Paper: https://arxiv.org/abs/2308.06595
GitHub: http://github.com/mlfoundations/Visit-Bench
Point of Contact: yonatanbitton1@gmail.com, hbansal@ucla.edu, jmhessel@gmail.com
## Dataset Structure
### Data Fields
- `instruction_category` (string): the category of the instruction.
- `image_url` (string): the URL of the image in the instruction.
- `image` (image): the image in the instruction.
- `visual` (string): the visual details in the instruction.
- `instruction` (string): the instruction itself.
- `instruction_conditioned_caption` (string): a dense caption that allows a text-only model to correctly follow the instruction.
- `reference_output` (string): the label obtained from the original source dataset, if one exists.
- `human_ratings_gpt4_correct` (boolean): human rating indicating whether GPT-4 correctly followed the instruction.
- `human_ratings_problem_in_caption` (boolean): human rating indicating whether there is a problem in the caption.
- `human_ratings_problem_in_gpt4` (boolean): human rating indicating whether there is a problem in GPT-4's response.
- `public_images_metadata` (dictionary): metadata about the image.
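Concretely, a single instance can be pictured as a plain Python mapping over these fields. The values below are invented placeholders for illustration (not real dataset content), and the `image` field is omitted since it holds a decoded image object:

```python
# Schematic VisIT-Bench instance using the field names documented above.
# All values are invented placeholders, not actual dataset content.
record = {
    "instruction_category": "counting",
    "image_url": "https://example.com/image.jpg",
    "visual": "a photo of birds on a fence",
    "instruction": "How many birds are sitting on the fence?",
    "instruction_conditioned_caption": "Three sparrows sit on a wooden fence.",
    "reference_output": "There are three birds on the fence.",
    "human_ratings_gpt4_correct": True,
    "human_ratings_problem_in_caption": False,
    "human_ratings_problem_in_gpt4": False,
    "public_images_metadata": {"license": "CC BY 4.0"},
}

# Because the three human-rating fields are booleans, instances where GPT-4
# succeeded without caption problems can be selected with a simple check.
is_clean_success = (
    record["human_ratings_gpt4_correct"]
    and not record["human_ratings_problem_in_caption"]
)
print(is_clean_success)  # True for this placeholder record
```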
### Data Splits
The dataset currently has a single TEST split. Further splits will be provided in the future.
### Data Loading
You can load the data as follows (credit to [Hugging Face Datasets](https://huggingface.co/datasets)):
```python
from datasets import load_dataset
examples = load_dataset('mlfoundations/visit-bench', use_auth_token=<YOUR USER ACCESS TOKEN>)
```
You can get `<YOUR USER ACCESS TOKEN>` by following these steps:
1) log into your Hugging Face account
2) click on your profile picture
3) click "Settings"
4) click "Access Tokens"
5) generate a new token and use that in the `use_auth_token` field
## Licensing Information
The new contributions of our dataset (e.g., the instructions, reference outputs, model ranking annotations, etc.) are licensed under the Creative Commons Attribution 4.0 International License (CC BY 4.0).
All images used are publicly licensed. Please refer to the public license attached to each individual image in the "public_images_metadata" field in the dataset sheets.
Alongside this license, the following conditions apply:
1. **Purpose:** The dataset was primarily designed for use as a test set.
2. **Commercial Use:** Commercially, the dataset may be used as a test set, but it's prohibited to use it as a training set.
By accessing or using this dataset, you acknowledge and agree to abide by these terms in conjunction with the CC BY 4.0 license.
## Annotations
The dataset is annotated using crowd workers on Amazon Mechanical Turk. Workers followed the steps detailed in the paper to generate the annotations. The instructions, reference outputs, and model ranking annotations were generated through this process.
## Considerations for Using the Data
Social Impact of Dataset: The dataset is aimed to facilitate research on AI models' ability to understand and follow instructions given in natural language and paired with visual inputs. Such research could contribute to the development of more interactive, capable, and intelligent AI systems. It could also illuminate areas where current AI technology falls short, informing future research directions.
Data Limitations: The dataset may not cover all possible types of instructions, particularly those requiring complex reasoning or advanced knowledge. The dataset was also created using crowd workers, and thus, may contain mistakes or inconsistencies.
Privacy: The images used in this dataset are publicly available. However, the exact source of the images is not disclosed in the dataset, protecting the privacy of the image creators to some extent. The workers who generated the instructions and annotations were also anonymized.
Curation Rationale: The dataset was curated to provide a broad range of instruction types and difficulty levels. The creators selected a mix of easy, medium, and hard instructions to challenge current AI capabilities.
## Citation Information
```bibtex
@misc{bitton2023visitbench,
      title={VisIT-Bench: A Benchmark for Vision-Language Instruction Following Inspired by Real-World Use},
      author={Yonatan Bitton and Hritik Bansal and Jack Hessel and Rulin Shao and Wanrong Zhu and Anas Awadalla and Josh Gardner and Rohan Taori and Ludwig Schmidt},
      year={2023},
      eprint={2308.06595},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```
|
newsmediabias/fake_news_elections_labelled_data | ---
license: mit
task_categories:
- text-classification
language:
- en
pretty_name: fake-news-elections
---
# Dataset Card for Election-Related Fake News Classification
## Dataset Summary
This dataset is designed for the task of fake news classification in the context of elections. It consists of news articles, social media posts, and other text sources related to various elections worldwide. Each entry in the dataset is labeled as 'fake' or 'real' based on its content and the veracity of the information presented.
https://arxiv.org/abs/2312.03750
### Languages
English
### Data Instances
A typical data instance comprises:
- **Text:** The content of the news article or post.
- **Label:** A binary label, where '0' indicates 'real' news and '1' indicates 'fake' news.
Example:
```json
{
"text": "The president announced a new policy today...",
"label": REAL
}
```
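Since the stored label is an integer, a small helper (hypothetical, not shipped with the dataset) can render records with a human-readable label name:

```python
# Label convention from the card: 0 -> real news, 1 -> fake news.
LABEL_NAMES = {0: "REAL", 1: "FAKE"}

def render_example(text: str, label: int) -> dict:
    """Return a record in the card's JSON shape with a readable label name."""
    return {"text": text, "label": LABEL_NAMES[label]}

print(render_example("The president announced a new policy today...", 0))
# {'text': 'The president announced a new policy today...', 'label': 'REAL'}
```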
#### Annotation process
Annotations were generated using LLMs and then verified by subject matter experts, who labeled each text as 'real' or 'fake' based on factual accuracy and context.
### Social Impact of Dataset
This dataset plays a crucial role in combating the spread of misinformation during elections, which is vital for maintaining the integrity of democratic processes.
### Discussion of Biases
There may be biases in the dataset due to the predominance of certain sources or the subjective nature of some news categorizations.
## Citation
If you use this dataset in your research, please cite it as follows:
```bibtex
@article{rahman2023analyzing,
title={Analyzing the Influence of Fake News in the 2024 Elections: A Comprehensive Dataset},
author={Rahman, Mizanur and Raza, Shaina},
journal={arXiv preprint arXiv:2312.03750},
year={2023}
}
|
mask-distilled-one-sec-cv12/chunk_180 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 965748720
num_examples: 189660
download_size: 986193346
dataset_size: 965748720
---
# Dataset Card for "chunk_180"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_sst2_aint_have | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 1497
num_examples: 11
- name: test
num_bytes: 2644
num_examples: 17
- name: train
num_bytes: 48813
num_examples: 401
download_size: 25500
dataset_size: 52954
---
# Dataset Card for "MULTI_VALUE_sst2_aint_have"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Venkateshwarang/Task_1_Demodataset | ---
dataset_info:
features:
- name: english to thanglish
dtype: string
splits:
- name: train
num_bytes: 16292
num_examples: 63
download_size: 9165
dataset_size: 16292
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Babelscape/multinerd | ---
annotations_creators:
- machine-generated
language_creators:
- machine-generated
language:
- de
- en
- es
- fr
- it
- nl
- pl
- pt
- ru
- zh
license:
- cc-by-nc-sa-4.0
multilinguality:
- multilingual
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- named-entity-recognition
pretty_name: multinerd-dataset
tags:
- structure-prediction
---
## Table of Contents
- [Description](#description)
- [Dataset Structure](#dataset-structure)
- [Additional Information](#additional-information)
## Dataset Card for MultiNERD dataset
## Dataset Description
- **Summary:** Training data for fine-grained NER in 10 languages.
- **Repository:** [https://github.com/Babelscape/multinerd](https://github.com/Babelscape/multinerd)
- **Paper:** [https://aclanthology.org/multinerd](https://aclanthology.org/2022.findings-naacl.60/)
- **Point of Contact:** [tedeschi@babelscape.com](tedeschi@babelscape.com)
## Description
- **Summary:** In a nutshell, MultiNERD is the first **language-agnostic** methodology for automatically creating **multilingual, multi-genre and fine-grained annotations** for **Named Entity Recognition** and **Entity Disambiguation**. Specifically, it can be seen as an extension of the combination of two prior works from our research group: [WikiNEuRal](https://www.github.com/Babelscape/wikineural), from which we took inspiration for the state-of-the-art silver-data creation methodology, and [NER4EL](https://www.github.com/Babelscape/NER4EL), from which we took the fine-grained classes and inspiration for the entity linking part. The produced dataset covers: **10 languages** (Chinese, Dutch, English, French, German, Italian, Polish, Portuguese, Russian and Spanish), **15 NER categories** (Person (PER), Location (LOC), Organization (ORG), Animal (ANIM), Biological entity (BIO), Celestial Body (CEL), Disease (DIS), Event (EVE), Food (FOOD), Instrument (INST), Media (MEDIA), Plant (PLANT), Mythological entity (MYTH), Time (TIME) and Vehicle (VEHI)), and **2 textual genres** ([Wikipedia](https://www.wikipedia.org/) and [WikiNews](https://www.wikinews.org/)).
## Dataset Structure
The data fields are the same among all splits.
- `tokens`: a `list` of `string` features.
- `ner_tags`: a `list` of classification labels (`int`).
- `lang`: a `string` feature. Full list of languages: Chinese (zh), Dutch (nl), English (en), French (fr), German (de), Italian (it), Polish (pl), Portuguese (pt), Russian (ru), Spanish (es).
- The full tagset with indices is reported below:
```python
{
"O": 0,
"B-PER": 1,
"I-PER": 2,
"B-ORG": 3,
"I-ORG": 4,
"B-LOC": 5,
"I-LOC": 6,
"B-ANIM": 7,
"I-ANIM": 8,
"B-BIO": 9,
"I-BIO": 10,
"B-CEL": 11,
"I-CEL": 12,
"B-DIS": 13,
"I-DIS": 14,
"B-EVE": 15,
"I-EVE": 16,
"B-FOOD": 17,
"I-FOOD": 18,
"B-INST": 19,
"I-INST": 20,
"B-MEDIA": 21,
"I-MEDIA": 22,
"B-MYTH": 23,
"I-MYTH": 24,
"B-PLANT": 25,
"I-PLANT": 26,
"B-TIME": 27,
"I-TIME": 28,
"B-VEHI": 29,
"I-VEHI": 30,
}
```
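Since `ner_tags` stores integers, the mapping above can be inverted to recover the string labels. The sentence below is an invented example, not taken from the dataset:

```python
# Invert the card's tag -> index mapping to decode integer ner_tags.
TAG2ID = {
    "O": 0, "B-PER": 1, "I-PER": 2, "B-ORG": 3, "I-ORG": 4,
    "B-LOC": 5, "I-LOC": 6, "B-ANIM": 7, "I-ANIM": 8, "B-BIO": 9,
    "I-BIO": 10, "B-CEL": 11, "I-CEL": 12, "B-DIS": 13, "I-DIS": 14,
    "B-EVE": 15, "I-EVE": 16, "B-FOOD": 17, "I-FOOD": 18, "B-INST": 19,
    "I-INST": 20, "B-MEDIA": 21, "I-MEDIA": 22, "B-MYTH": 23, "I-MYTH": 24,
    "B-PLANT": 25, "I-PLANT": 26, "B-TIME": 27, "I-TIME": 28,
    "B-VEHI": 29, "I-VEHI": 30,
}
ID2TAG = {idx: tag for tag, idx in TAG2ID.items()}

def decode(tokens, ner_tags):
    """Pair each token with its BIO tag string."""
    return [(tok, ID2TAG[i]) for tok, i in zip(tokens, ner_tags)]

# Invented sentence containing a PER span and a LOC span.
tokens = ["Simone", "Tedeschi", "works", "in", "Rome", "."]
tags = [1, 2, 0, 0, 5, 0]
print(decode(tokens, tags))
# [('Simone', 'B-PER'), ('Tedeschi', 'I-PER'), ('works', 'O'), ('in', 'O'), ('Rome', 'B-LOC'), ('.', 'O')]
```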
## Additional Information
- **Licensing Information**: Contents of this repository are restricted to only non-commercial research purposes under the [Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License (CC BY-NC-SA 4.0)](https://creativecommons.org/licenses/by-nc-sa/4.0/). Copyright of the dataset contents belongs to the original copyright holders.
- **Citation Information**: Please consider citing our work if you use data and/or code from this repository.
```bibtex
@inproceedings{tedeschi-navigli-2022-multinerd,
title = "{M}ulti{NERD}: A Multilingual, Multi-Genre and Fine-Grained Dataset for Named Entity Recognition (and Disambiguation)",
author = "Tedeschi, Simone and
Navigli, Roberto",
booktitle = "Findings of the Association for Computational Linguistics: NAACL 2022",
month = jul,
year = "2022",
address = "Seattle, United States",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2022.findings-naacl.60",
doi = "10.18653/v1/2022.findings-naacl.60",
pages = "801--812",
abstract = "Named Entity Recognition (NER) is the task of identifying named entities in texts and classifying them through specific semantic categories, a process which is crucial for a wide range of NLP applications. Current datasets for NER focus mainly on coarse-grained entity types, tend to consider a single textual genre and to cover a narrow set of languages, thus limiting the general applicability of NER systems.In this work, we design a new methodology for automatically producing NER annotations, and address the aforementioned limitations by introducing a novel dataset that covers 10 languages, 15 NER categories and 2 textual genres.We also introduce a manually-annotated test set, and extensively evaluate the quality of our novel dataset on both this new test set and standard benchmarks for NER.In addition, in our dataset, we include: i) disambiguation information to enable the development of multilingual entity linking systems, and ii) image URLs to encourage the creation of multimodal systems.We release our dataset at https://github.com/Babelscape/multinerd.",
}
```
- **Contributions**: Thanks to [@sted97](https://github.com/sted97) for adding this dataset.
|
CyberHarem/ashley_lapisrelights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Ashley (Lapis Re:LiGHTs)
This is the dataset of Ashley (Lapis Re:LiGHTs), containing 243 images and their tags.
The core tags of this character are `purple_hair, long_hair, hair_between_eyes, ponytail, purple_eyes, bangs, bow, hair_bow, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 243 | 142.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ashley_lapisrelights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 243 | 121.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ashley_lapisrelights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 476 | 220.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ashley_lapisrelights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 243 | 141.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ashley_lapisrelights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 476 | 248.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ashley_lapisrelights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ashley_lapisrelights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discovered here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, black_gloves, fingerless_gloves, solo, hair_ribbon, outdoors, day, high_ponytail, medium_breasts, sleeveless_shirt, upper_body, bare_shoulders, black_ribbon, closed_mouth, cloud, bike_shorts, blue_sky, looking_at_viewer |
| 1 | 11 |  |  |  |  |  | 1girl, solo, closed_mouth, hair_ribbon, high_ponytail, upper_body, sailor_collar, short_sleeves, school_uniform, indoors, looking_at_viewer, crossed_arms, frills, shirt, collarbone, red_neckerchief |
| 2 | 38 |  |  |  |  |  | black_shirt, 1girl, solo, short_sleeves, black_bow, closed_mouth, collared_shirt, single_braid, upper_body, indoors, braided_ponytail, breasts, red_skirt, sitting, smile |
| 3 | 7 |  |  |  |  |  | dress, frills, hair_ribbon, solo_focus, closed_mouth, high_ponytail, collarbone, red_ascot, sailor_collar, 2girls, black_ribbon, looking_at_viewer, medium_breasts, short_sleeves, sleeveless, smile, underbust |
| 4 | 5 |  |  |  |  |  | 1girl, black_gloves, dress, fingerless_gloves, solo, breasts, open_mouth, closed_eyes, outdoors, pantyhose, rain |
| 5 | 10 |  |  |  |  |  | 1girl, collarbone, solo, indoors, upper_body, closed_mouth, curtains, dress, long_sleeves, looking_at_viewer, window, breasts, shirt |
| 6 | 6 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, solo, dress, detached_collar, detached_sleeves, looking_at_viewer, medium_breasts, single_braid, upper_body |
| 7 | 5 |  |  |  |  |  | 1girl, open_mouth, solo, sweat, blue_eyes, constricted_pupils |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_gloves | fingerless_gloves | solo | hair_ribbon | outdoors | day | high_ponytail | medium_breasts | sleeveless_shirt | upper_body | bare_shoulders | black_ribbon | closed_mouth | cloud | bike_shorts | blue_sky | looking_at_viewer | sailor_collar | short_sleeves | school_uniform | indoors | crossed_arms | frills | shirt | collarbone | red_neckerchief | black_shirt | black_bow | collared_shirt | single_braid | braided_ponytail | breasts | red_skirt | sitting | smile | dress | solo_focus | red_ascot | 2girls | sleeveless | underbust | open_mouth | closed_eyes | pantyhose | rain | curtains | long_sleeves | window | cleavage | detached_collar | detached_sleeves | sweat | blue_eyes | constricted_pupils |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------------------|:-------|:--------------|:-----------|:------|:----------------|:-----------------|:-------------------|:-------------|:-----------------|:---------------|:---------------|:--------|:--------------|:-----------|:--------------------|:----------------|:----------------|:-----------------|:----------|:---------------|:---------|:--------|:-------------|:------------------|:--------------|:------------|:-----------------|:---------------|:-------------------|:----------|:------------|:----------|:--------|:--------|:-------------|:------------|:---------|:-------------|:------------|:-------------|:--------------|:------------|:-------|:-----------|:---------------|:---------|:-----------|:------------------|:-------------------|:--------|:------------|:---------------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | | | X | X | | | X | | | X | | | X | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 38 |  |  |  |  |  | X | | | X | | | | | | | X | | | X | | | | | | X | | X | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | | | | | X | | | X | X | | | | X | X | | | | X | X | X | | | | X | | X | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | | | | | | X | X | X | X | | | | | | | | | |
| 5 | 10 |  |  |  |  |  | X | | | X | | | | | | | X | | | X | | | | X | | | | X | | | X | X | | | | | | | X | | | | X | | | | | | | | | | X | X | X | | | | | | |
| 6 | 6 |  |  |  |  |  | X | | | X | | | | | X | | X | X | | | | | | X | | | | | | | | | | | | | X | | | | | | X | | | | | | | | | | | | | X | X | X | | | |
| 7 | 5 |  |  |  |  |  | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | X | X |
|