| datasetId | card |
|---|---|
mariosasko/test_push_split | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
license:
- cc0-1.0
multilinguality:
- multilingual
size_categories:
ab:
- n<1K
ar:
- 10K<n<100K
as:
- n<1K
br:
- 10K<n<100K
ca:
- 100K<n<1M
cnh:
- 1K<n<10K
cs:
- 10K<n<100K
cv:
- 10K<n<100K
cy:
- 10K<n<100K
de:
- 100K<n<1M
dv:
- 10K<n<100K
el:
- 10K<n<100K
en:
- 1M<n<10M
eo:
- 10K<n<100K
es:
- 100K<n<1M
et:
- 10K<n<100K
eu:
- 10K<n<100K
fa:
- 100K<n<1M
fi:
- 1K<n<10K
fr:
- 100K<n<1M
fy-NL:
- 10K<n<100K
ga-IE:
- 1K<n<10K
hi:
- n<1K
hsb:
- 1K<n<10K
hu:
- 1K<n<10K
ia:
- 1K<n<10K
id:
- 10K<n<100K
it:
- 100K<n<1M
ja:
- 1K<n<10K
ka:
- 1K<n<10K
kab:
- 100K<n<1M
ky:
- 10K<n<100K
lg:
- 1K<n<10K
lt:
- 1K<n<10K
lv:
- 1K<n<10K
mn:
- 10K<n<100K
mt:
- 10K<n<100K
nl:
- 10K<n<100K
or:
- 1K<n<10K
pa-IN:
- 1K<n<10K
pl:
- 100K<n<1M
pt:
- 10K<n<100K
rm-sursilv:
- 1K<n<10K
rm-vallader:
- 1K<n<10K
ro:
- 1K<n<10K
ru:
- 10K<n<100K
rw:
- 1M<n<10M
sah:
- 1K<n<10K
sl:
- 1K<n<10K
sv-SE:
- 10K<n<100K
ta:
- 10K<n<100K
th:
- 10K<n<100K
tr:
- 10K<n<100K
tt:
- 10K<n<100K
uk:
- 10K<n<100K
vi:
- 1K<n<10K
vot:
- n<1K
zh-CN:
- 10K<n<100K
zh-HK:
- 10K<n<100K
zh-TW:
- 10K<n<100K
source_datasets:
- extended|common_voice
paperswithcode_id: common-voice
pretty_name: Common Voice Corpus 6.1
language_bcp47:
- ab
- ar
- as
- br
- ca
- cnh
- cs
- cv
- cy
- de
- dv
- el
- en
- eo
- es
- et
- eu
- fa
- fi
- fr
- fy-NL
- ga-IE
- hi
- hsb
- hu
- ia
- id
- it
- ja
- ka
- kab
- ky
- lg
- lt
- lv
- mn
- mt
- nl
- or
- pa-IN
- pl
- pt
- rm-sursilv
- rm-vallader
- ro
- ru
- rw
- sah
- sl
- sv-SE
- ta
- th
- tr
- tt
- uk
- vi
- vot
- zh-CN
- zh-HK
- zh-TW
extra_gated_prompt: By clicking on “Access repository” below, you also agree to not
attempt to determine the identity of speakers in the Common Voice dataset.
task_categories:
- automatic-speech-recognition
---
# Dataset Card for Common Voice Corpus 6.1
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://commonvoice.mozilla.org/en/datasets
- **Repository:** https://github.com/common-voice/common-voice
- **Paper:** https://arxiv.org/abs/1912.06670
- **Leaderboard:** https://paperswithcode.com/dataset/common-voice
- **Point of Contact:** [Anton Lozhkov](mailto:anton@huggingface.co)
### Dataset Summary
The Common Voice dataset consists of unique MP3 files and corresponding text files.
Many of the 9,283 recorded hours in the dataset also include demographic metadata like age, sex, and accent
that can help improve the accuracy of speech recognition engines.
The dataset currently consists of 7,335 validated hours in 60 languages, but more voices and languages are always being added.
Take a look at the [Languages](https://commonvoice.mozilla.org/en/languages) page to request a language or start contributing.
### Supported Tasks and Leaderboards
The results for models trained on the Common Voice datasets are available via the
[🤗 Speech Bench](https://huggingface.co/spaces/huggingface/hf-speech-bench)
### Languages
```
Abkhaz, Arabic, Assamese, Basque, Breton, Catalan, Chinese (China), Chinese (Hong Kong), Chinese (Taiwan), Chuvash, Czech, Dhivehi, Dutch, English, Esperanto, Estonian, Finnish, French, Frisian, Georgian, German, Greek, Hakha Chin, Hindi, Hungarian, Indonesian, Interlingua, Irish, Italian, Japanese, Kabyle, Kinyarwanda, Kyrgyz, Latvian, Lithuanian, Luganda, Maltese, Mongolian, Odia, Persian, Polish, Portuguese, Punjabi, Romanian, Romansh Sursilvan, Romansh Vallader, Russian, Sakha, Slovenian, Spanish, Swedish, Tamil, Tatar, Thai, Turkish, Ukrainian, Upper Sorbian, Vietnamese, Votic, Welsh
```
## Dataset Structure
### Data Instances
A typical data point comprises the `path` to the audio file and its `sentence`.
Additional fields include `accent`, `age`, `client_id`, `up_votes`, `down_votes`, `gender`, `locale` and `segment`.
```python
{
    'client_id': 'd59478fbc1ee646a28a3c652a119379939123784d99131b865a89f8b21c81f69276c48bd574b81267d9d1a77b83b43e6d475a6cfc79c232ddbca946ae9c7afc5',
    'path': 'et/clips/common_voice_et_18318995.mp3',
    'audio': {
        'path': 'et/clips/common_voice_et_18318995.mp3',
        'array': array([-0.00048828, -0.00018311, -0.00137329, ..., 0.00079346, 0.00091553, 0.00085449], dtype=float32),
        'sampling_rate': 48000
    },
    'sentence': 'Tasub kokku saada inimestega, keda tunned juba ammust ajast saati.',
    'up_votes': 2,
    'down_votes': 0,
    'age': 'twenties',
    'gender': 'male',
    'accent': '',
    'locale': 'et',
    'segment': ''
}
```
### Data Fields
`client_id` (`string`): An id for which client (voice) made the recording
`path` (`string`): The path to the audio file
`audio` (`dict`): A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the `"audio"` column, *i.e.* `dataset[0]["audio"]` should **always** be preferred over `dataset["audio"][0]`.
`sentence` (`string`): The sentence the user was prompted to speak
`up_votes` (`int64`): How many upvotes the audio file has received from reviewers
`down_votes` (`int64`): How many downvotes the audio file has received from reviewers
`age` (`string`): The age of the speaker (e.g. `teens`, `twenties`, `fifties`)
`gender` (`string`): The gender of the speaker
`accent` (`string`): Accent of the speaker
`locale` (`string`): The locale of the speaker
`segment` (`string`): Usually an empty field
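Because the `audio` dict pairs the decoded samples with their sampling rate, a clip's duration can be recovered as `len(array) / sampling_rate`. A minimal sketch, using a synthetic list standing in for the decoded array (loading the real data requires accepting the gating terms):

```python
# Synthetic stand-in for the decoded `audio` field of one example:
# a 1.5-second clip at the corpus's 48 kHz sampling rate.
sampling_rate = 48_000
audio_array = [0.0] * int(1.5 * sampling_rate)

# Duration in seconds is the number of samples divided by the sampling rate.
duration_seconds = len(audio_array) / sampling_rate
print(duration_seconds)  # 1.5
```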
### Data Splits
The speech material has been subdivided into portions for dev, train, test, validated, invalidated, reported and other.
The validated data has been reviewed and has received upvotes confirming that it is of high quality.
The invalidated data has been reviewed and has received downvotes indicating that it is of low quality.
The reported data has been flagged by users, for various reasons.
The other data has not yet been reviewed.
The dev, test and train portions contain data that has been reviewed, deemed of high quality and split into dev, test and train.
## Data Preprocessing Recommended by Hugging Face
The following are data preprocessing steps advised by the Hugging Face team. They are accompanied by an example code snippet that shows how to put them into practice.
Many examples in this dataset have trailing quotation marks, e.g. _“the cat sat on the mat.”_. These quotation marks do not change the actual meaning of the sentence, and it is nearly impossible to infer from audio data alone whether a sentence is a quotation. In these cases, it is advised to strip the quotation marks, leaving: _the cat sat on the mat_.
In addition, the majority of training sentences end in punctuation ( . or ? or ! ), whereas just a small proportion do not. In the dev set, **almost all** sentences end in punctuation. Thus, it is recommended to append a full-stop ( . ) to the small number of training examples that do not end in punctuation.
```python
from datasets import load_dataset

ds = load_dataset("mozilla-foundation/common_voice_6_1", "en", use_auth_token=True)

def prepare_dataset(batch):
    """Function to preprocess the dataset with the .map method"""
    transcription = batch["sentence"]

    if transcription.startswith('"') and transcription.endswith('"'):
        # we can remove trailing quotation marks as they do not affect the transcription
        transcription = transcription[1:-1]

    if transcription[-1] not in [".", "?", "!"]:
        # append a full-stop to sentences that do not end in punctuation
        transcription = transcription + "."

    batch["sentence"] = transcription
    return batch

ds = ds.map(prepare_dataset, desc="preprocess dataset")
```
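The same normalization logic can be checked on plain strings, independently of the dataset download (the helper name `normalize_sentence` is ours, not part of the corpus tooling):

```python
def normalize_sentence(sentence: str) -> str:
    """Strip wrapping quotation marks and ensure terminal punctuation."""
    if sentence.startswith('"') and sentence.endswith('"'):
        # quotation marks around the full sentence carry no acoustic information
        sentence = sentence[1:-1]
    if sentence and sentence[-1] not in [".", "?", "!"]:
        # append a full-stop where terminal punctuation is missing
        sentence += "."
    return sentence

print(normalize_sentence('"the cat sat on the mat."'))  # the cat sat on the mat.
print(normalize_sentence("hello there"))                # hello there.
```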
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in the Common Voice dataset.
## Considerations for Using the Data
### Social Impact of Dataset
The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in the Common Voice dataset.
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Public Domain, [CC-0](https://creativecommons.org/share-your-work/public-domain/cc0/)
### Citation Information
```
@inproceedings{commonvoice:2020,
author = {Ardila, R. and Branson, M. and Davis, K. and Henretty, M. and Kohler, M. and Meyer, J. and Morais, R. and Saunders, L. and Tyers, F. M. and Weber, G.},
title = {Common Voice: A Massively-Multilingual Speech Corpus},
booktitle = {Proceedings of the 12th Conference on Language Resources and Evaluation (LREC 2020)},
pages = {4211--4215},
year = 2020
}
```
|
open-llm-leaderboard/details_Corianas__Quokka_2.7b | ---
pretty_name: Evaluation run of Corianas/Quokka_2.7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Corianas/Quokka_2.7b](https://huggingface.co/Corianas/Quokka_2.7b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Corianas__Quokka_2.7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-18T03:05:58.053951](https://huggingface.co/datasets/open-llm-leaderboard/details_Corianas__Quokka_2.7b/blob/main/results_2023-09-18T03-05-58.053951.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.027055369127516778,\n\
\ \"em_stderr\": 0.0016615386418947858,\n \"f1\": 0.0843078859060403,\n\
\ \"f1_stderr\": 0.0021162612701253174,\n \"acc\": 0.27932236818091244,\n\
\ \"acc_stderr\": 0.007830181847252834\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.027055369127516778,\n \"em_stderr\": 0.0016615386418947858,\n\
\ \"f1\": 0.0843078859060403,\n \"f1_stderr\": 0.0021162612701253174\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0037907505686125853,\n \
\ \"acc_stderr\": 0.0016927007401501802\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5548539857932123,\n \"acc_stderr\": 0.013967662954355487\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Corianas/Quokka_2.7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|arc:challenge|25_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_18T03_05_58.053951
path:
- '**/details_harness|drop|3_2023-09-18T03-05-58.053951.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-18T03-05-58.053951.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_18T03_05_58.053951
path:
- '**/details_harness|gsm8k|5_2023-09-18T03-05-58.053951.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-18T03-05-58.053951.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hellaswag|10_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:58:12.174583.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T15:58:12.174583.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T15:58:12.174583.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_18T03_05_58.053951
path:
- '**/details_harness|winogrande|5_2023-09-18T03-05-58.053951.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-18T03-05-58.053951.parquet'
- config_name: results
data_files:
- split: 2023_07_19T15_58_12.174583
path:
- results_2023-07-19T15:58:12.174583.parquet
- split: 2023_09_18T03_05_58.053951
path:
- results_2023-09-18T03-05-58.053951.parquet
- split: latest
path:
- results_2023-09-18T03-05-58.053951.parquet
---
# Dataset Card for Evaluation run of Corianas/Quokka_2.7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Corianas/Quokka_2.7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Corianas/Quokka_2.7b](https://huggingface.co/Corianas/Quokka_2.7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the results of the most recent run.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Corianas__Quokka_2.7b",
                    "harness_winogrande_5",
                    split="latest")
```
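As the metadata above shows, each run is also stored as a split named after its timestamp, with the hyphens and colons replaced by underscores. A minimal sketch of that naming convention (`run_timestamp_to_split` is our own illustrative helper, not part of any leaderboard tooling):

```python
# Hypothetical helper illustrating the split-naming convention used in this
# repo: hyphens and colons in the run timestamp become underscores, while the
# fractional-seconds dot is kept.
def run_timestamp_to_split(timestamp: str) -> str:
    return timestamp.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split("2023-09-18T03:05:58.053951"))
# → 2023_09_18T03_05_58.053951
```

Passing such a name as `split=` instead of `"latest"` selects the results of one specific run.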
## Latest results
These are the [latest results from run 2023-09-18T03:05:58.053951](https://huggingface.co/datasets/open-llm-leaderboard/details_Corianas__Quokka_2.7b/blob/main/results_2023-09-18T03-05-58.053951.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in its own results file and in the "latest" split of each eval's configuration):
```python
{
"all": {
"em": 0.027055369127516778,
"em_stderr": 0.0016615386418947858,
"f1": 0.0843078859060403,
"f1_stderr": 0.0021162612701253174,
"acc": 0.27932236818091244,
"acc_stderr": 0.007830181847252834
},
"harness|drop|3": {
"em": 0.027055369127516778,
"em_stderr": 0.0016615386418947858,
"f1": 0.0843078859060403,
"f1_stderr": 0.0021162612701253174
},
"harness|gsm8k|5": {
"acc": 0.0037907505686125853,
"acc_stderr": 0.0016927007401501802
},
"harness|winogrande|5": {
"acc": 0.5548539857932123,
"acc_stderr": 0.013967662954355487
}
}
```
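The `"all"` block appears to be a per-metric average over the tasks that report that metric: `em` and `f1` come from DROP alone, while `acc` is the mean of the GSM8K and Winogrande accuracies. A quick sanity check (our own arithmetic, not the leaderboard's aggregation code):

```python
# Sanity check (our own arithmetic, not the leaderboard's aggregation code):
# "all" averages each metric over the tasks that report it.
gsm8k_acc = 0.0037907505686125853
winogrande_acc = 0.5548539857932123

all_acc = (gsm8k_acc + winogrande_acc) / 2
# Agrees with the "all" acc above (0.27932236818091244) to float precision.
print(all_acc)

gsm8k_stderr = 0.0016927007401501802
winogrande_stderr = 0.013967662954355487
all_stderr = (gsm8k_stderr + winogrande_stderr) / 2
# Agrees with the "all" acc_stderr above (0.007830181847252834).
print(all_stderr)
```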
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
nannullna/laion_subset | ---
configs:
- config_name: default
data_files:
- split: artwork
path: data/artwork-*
- split: person
path: data/person-*
- split: object
path: data/object-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: url
dtype: string
- name: punsafe
dtype: float64
- name: pwatermark
dtype: float64
splits:
- name: artwork
num_bytes: 235558764.0
num_examples: 452
- name: person
num_bytes: 254743194.0
num_examples: 501
- name: object
num_bytes: 57867679.0
num_examples: 114
download_size: 548177028
dataset_size: 548169637.0
---
# Dataset Card for "laion_subset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bsaurav/biography | ---
license: apache-2.0
---
|
harpreetsahota/one_shot_comparison_with_results | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
- name: rationale
dtype: string
- name: task
dtype: string
- name: type
dtype: string
- name: decilm_generation
dtype: string
- name: mistral_generation
dtype: string
- name: mpt_generation
dtype: string
- name: decilm_generation_evaluation
dtype: string
- name: mistral_generation_evaluation
dtype: string
- name: mpt_generation_evaluation
dtype: string
- name: decilm_generation_score
dtype: int64
- name: mistral_generation_score
dtype: int64
- name: mpt_generation_score
dtype: int64
splits:
- name: train
num_bytes: 81749
num_examples: 30
download_size: 57106
dataset_size: 81749
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/queen_elizabeth_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of queen_elizabeth/クイーン・エリザベス/伊丽莎白女王 (Azur Lane)
This is the dataset of queen_elizabeth/クイーン・エリザベス/伊丽莎白女王 (Azur Lane), containing 335 images and their tags.
The core tags of this character are `blonde_hair, long_hair, blue_eyes, crown, bow, hairband, mini_crown, hair_bow, bangs, black_hairband, fang, breasts, white_bow, small_breasts, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 335 | 418.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/queen_elizabeth_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 335 | 249.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/queen_elizabeth_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 750 | 506.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/queen_elizabeth_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 335 | 373.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/queen_elizabeth_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 750 | 715.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/queen_elizabeth_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/queen_elizabeth_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, white_gloves, blush, detached_sleeves, dress, open_mouth, solo, covered_navel, simple_background, white_background, looking_at_viewer, bare_shoulders, :d, white_thighhighs |
| 1 | 5 |  |  |  |  |  | 1girl, :d, holding, open_mouth, scepter, solo, white_gloves, blush, chibi, detached_sleeves, dress, simple_background, staff, v-shaped_eyebrows, white_background, white_thighhighs, covered_navel, full_body, looking_at_viewer, machinery, turret, detached_collar, flat_chest, strapless, zettai_ryouiki |
| 2 | 5 |  |  |  |  |  | 1girl, detached_sleeves, holding_cup, looking_at_viewer, sitting, solo, teacup, white_gloves, white_thighhighs, blush, dress, :d, chair, open_mouth, outdoors, saucer, skin_fang |
| 3 | 12 |  |  |  |  |  | blue_dress, maid_headdress, open_mouth, 1girl, maid_apron, teacup, white_apron, sitting, white_gloves, looking_at_viewer, official_alternate_costume, chair, frilled_apron, holding_cup, solo, :d, crossed_legs, short_sleeves, skin_fang, table, food, high_heels, white_thighhighs |
| 4 | 9 |  |  |  |  |  | 1girl, fake_animal_ears, looking_at_viewer, playboy_bunny, rabbit_ears, solo, strapless_leotard, detached_collar, blue_leotard, blush, covered_navel, rabbit_tail, white_background, bare_shoulders, black_pantyhose, blue_bowtie, fishnet_pantyhose, simple_background, wrist_cuffs |
| 5 | 9 |  |  |  |  |  | 1girl, solo, blush, hat, white_shirt, blue_headwear, looking_at_viewer, open_mouth, smile, short_sleeves, blue_skirt, neck_ribbon, petals, puffy_sleeves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | white_gloves | blush | detached_sleeves | dress | open_mouth | solo | covered_navel | simple_background | white_background | looking_at_viewer | bare_shoulders | :d | white_thighhighs | holding | scepter | chibi | staff | v-shaped_eyebrows | full_body | machinery | turret | detached_collar | flat_chest | strapless | zettai_ryouiki | holding_cup | sitting | teacup | chair | outdoors | saucer | skin_fang | blue_dress | maid_headdress | maid_apron | white_apron | official_alternate_costume | frilled_apron | crossed_legs | short_sleeves | table | food | high_heels | fake_animal_ears | playboy_bunny | rabbit_ears | strapless_leotard | blue_leotard | rabbit_tail | black_pantyhose | blue_bowtie | fishnet_pantyhose | wrist_cuffs | hat | white_shirt | blue_headwear | smile | blue_skirt | neck_ribbon | petals | puffy_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------|:-------------------|:--------|:-------------|:-------|:----------------|:--------------------|:-------------------|:--------------------|:-----------------|:-----|:-------------------|:----------|:----------|:--------|:--------|:--------------------|:------------|:------------|:---------|:------------------|:-------------|:------------|:-----------------|:--------------|:----------|:---------|:--------|:-----------|:---------|:------------|:-------------|:-----------------|:-------------|:--------------|:-----------------------------|:----------------|:---------------|:----------------|:--------|:-------|:-------------|:-------------------|:----------------|:--------------|:--------------------|:---------------|:--------------|:------------------|:--------------|:--------------------|:--------------|:------|:--------------|:----------------|:--------|:-------------|:--------------|:---------|:----------------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | | | | X | | X | X | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 12 |  |  |  |  |  | X | X | | | | X | X | | | | X | | X | X | | | | | | | | | | | | | X | X | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | | X | | | | X | X | X | X | X | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 5 | 9 |  |  |  |  |  | X | | X | | | X | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X |
|
savabdoul/mycustom-medical-data-llama2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 78173
num_examples: 188
download_size: 21282
dataset_size: 78173
---
# Dataset Card for "mycustom-medical-data-llama2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
esalesky/ted-im2im-zhcn-en | ---
configs:
- config_name: en
default: true
data_files:
- split: train
path: "train-en.parquet"
- split: val
path: "val-en.parquet"
- split: test
path: "test-en.parquet"
features:
text:
dtype: string
id: null
_type: Value
filename:
dtype: string
id: null
_type: Value
image:
decode: true
id: null
_type: Image
- config_name: zhcn
data_files:
- split: train
path: "train-zhcn.parquet"
- split: val
path: "val-zhcn.parquet"
- split: test
path: "test-zhcn.parquet"
features:
text:
dtype: string
id: null
_type: Value
filename:
dtype: string
id: null
_type: Value
image:
decode: true
id: null
_type: Image
- config_name: zhcnen
data_files:
- split: train
path: "train-zhcnen.parquet"
- split: val
path: "val-zhcnen.parquet"
- split: test
path: "test-zhcnen.parquet"
features:
text:
dtype: string
id: null
_type: Value
filename:
dtype: string
id: null
_type: Value
image:
decode: true
id: null
_type: Image
- config_name: enzhcn
data_files:
- split: train
path: "train-enzhcn.parquet"
- split: val
path: "val-enzhcn.parquet"
- split: test
path: "test-enzhcn.parquet"
features:
text:
dtype: string
id: null
_type: Value
filename:
dtype: string
id: null
_type: Value
image:
decode: true
id: null
_type: Image
---
|
cleanrl/summarize_from_feedback_oai_preprocessing_1704427060 | ---
dataset_info:
features:
- name: info
struct:
- name: id
dtype: string
- name: post
dtype: string
- name: title
dtype: string
- name: subreddit
dtype: string
- name: site
dtype: string
- name: article
dtype: string
- name: summaries
list:
- name: text
dtype: string
- name: policy
dtype: string
- name: note
dtype: string
- name: choice
dtype: int32
- name: worker
dtype: string
- name: batch
dtype: string
- name: split
dtype: string
- name: extra
struct:
- name: confidence
dtype: int32
- name: query_token
sequence: int64
- name: query
dtype: string
- name: response0
dtype: string
- name: response0_token
sequence: int64
- name: response0_token_len
dtype: int64
- name: response1
dtype: string
- name: response1_token
sequence: int64
- name: response1_token_len
dtype: int64
- name: response0_policy
dtype: string
- name: response1_policy
dtype: string
- name: policies
dtype: string
- name: query_response0
dtype: string
- name: query_response0_token
sequence: int64
- name: query_response0_token_len
dtype: int64
- name: query_response1
dtype: string
- name: query_response1_token
sequence: int64
- name: query_response1_token_len
dtype: int64
splits:
- name: train
num_bytes: 2210564467
num_examples: 92858
- name: validation
num_bytes: 2103952346
num_examples: 86086
download_size: 278205924
dataset_size: 4314516813
---
# Dataset Card for "summarize_from_feedback_oai_preprocessing_1704427060"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-samsum-samsum-ccdc20-93203145794 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- samsum
eval_info:
task: summarization
model: ainize/bart-base-cnn
metrics: []
dataset_name: samsum
dataset_config: samsum
dataset_split: test
col_mapping:
text: dialogue
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: ainize/bart-base-cnn
* Dataset: samsum
* Config: samsum
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@sasha](https://huggingface.co/sasha) for evaluating this model. |
open-llm-leaderboard/details_PSanni__MPOMixtral-8x7B-Instruct-v0.1 | ---
pretty_name: Evaluation run of PSanni/MPOMixtral-8x7B-Instruct-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PSanni/MPOMixtral-8x7B-Instruct-v0.1](https://huggingface.co/PSanni/MPOMixtral-8x7B-Instruct-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PSanni__MPOMixtral-8x7B-Instruct-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-14T14:23:30.207507](https://huggingface.co/datasets/open-llm-leaderboard/details_PSanni__MPOMixtral-8x7B-Instruct-v0.1/blob/main/results_2024-01-14T14-23-30.207507.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7021378898937154,\n\
\ \"acc_stderr\": 0.03050401556178155,\n \"acc_norm\": 0.7057392541812927,\n\
\ \"acc_norm_stderr\": 0.031097861836160572,\n \"mc1\": 0.5152998776009792,\n\
\ \"mc1_stderr\": 0.017495304473187902,\n \"mc2\": 0.665216588266765,\n\
\ \"mc2_stderr\": 0.014619883028401507\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6800341296928327,\n \"acc_stderr\": 0.013631345807016193,\n\
\ \"acc_norm\": 0.7098976109215017,\n \"acc_norm_stderr\": 0.013261573677520764\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.690300736904999,\n\
\ \"acc_stderr\": 0.004614246282055377,\n \"acc_norm\": 0.8795060744871539,\n\
\ \"acc_norm_stderr\": 0.0032487292211528865\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6888888888888889,\n\
\ \"acc_stderr\": 0.0399926287661772,\n \"acc_norm\": 0.6888888888888889,\n\
\ \"acc_norm_stderr\": 0.0399926287661772\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7960526315789473,\n \"acc_stderr\": 0.0327900040631005,\n\
\ \"acc_norm\": 0.7960526315789473,\n \"acc_norm_stderr\": 0.0327900040631005\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7622641509433963,\n \"acc_stderr\": 0.02619980880756193,\n\
\ \"acc_norm\": 0.7622641509433963,\n \"acc_norm_stderr\": 0.02619980880756193\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.03309615177059006,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.03309615177059006\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7398843930635838,\n\
\ \"acc_stderr\": 0.033450369167889904,\n \"acc_norm\": 0.7398843930635838,\n\
\ \"acc_norm_stderr\": 0.033450369167889904\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6851063829787234,\n \"acc_stderr\": 0.03036358219723817,\n\
\ \"acc_norm\": 0.6851063829787234,\n \"acc_norm_stderr\": 0.03036358219723817\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5877192982456141,\n\
\ \"acc_stderr\": 0.04630653203366596,\n \"acc_norm\": 0.5877192982456141,\n\
\ \"acc_norm_stderr\": 0.04630653203366596\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.04028731532947558,\n\
\ \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.04028731532947558\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.47354497354497355,\n \"acc_stderr\": 0.025715239811346758,\n \"\
acc_norm\": 0.47354497354497355,\n \"acc_norm_stderr\": 0.025715239811346758\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8387096774193549,\n \"acc_stderr\": 0.020923327006423294,\n \"\
acc_norm\": 0.8387096774193549,\n \"acc_norm_stderr\": 0.020923327006423294\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5960591133004927,\n \"acc_stderr\": 0.03452453903822033,\n \"\
acc_norm\": 0.5960591133004927,\n \"acc_norm_stderr\": 0.03452453903822033\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8585858585858586,\n \"acc_stderr\": 0.024825909793343336,\n \"\
acc_norm\": 0.8585858585858586,\n \"acc_norm_stderr\": 0.024825909793343336\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9585492227979274,\n \"acc_stderr\": 0.01438543285747646,\n\
\ \"acc_norm\": 0.9585492227979274,\n \"acc_norm_stderr\": 0.01438543285747646\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7205128205128205,\n \"acc_stderr\": 0.022752388839776823,\n\
\ \"acc_norm\": 0.7205128205128205,\n \"acc_norm_stderr\": 0.022752388839776823\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.4,\n \"acc_stderr\": 0.029869605095316908,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.029869605095316908\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.773109243697479,\n \"acc_stderr\": 0.027205371538279472,\n\
\ \"acc_norm\": 0.773109243697479,\n \"acc_norm_stderr\": 0.027205371538279472\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4768211920529801,\n \"acc_stderr\": 0.04078093859163083,\n \"\
acc_norm\": 0.4768211920529801,\n \"acc_norm_stderr\": 0.04078093859163083\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8715596330275229,\n \"acc_stderr\": 0.014344977542914318,\n \"\
acc_norm\": 0.8715596330275229,\n \"acc_norm_stderr\": 0.014344977542914318\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5740740740740741,\n \"acc_stderr\": 0.033723432716530624,\n \"\
acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.033723432716530624\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.02485747808025045,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.02485747808025045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8776371308016878,\n \"acc_stderr\": 0.02133174182974679,\n \
\ \"acc_norm\": 0.8776371308016878,\n \"acc_norm_stderr\": 0.02133174182974679\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.030636591348699813,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.030636591348699813\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.0349814938546247,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.0349814938546247\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8842975206611571,\n \"acc_stderr\": 0.02919980245562281,\n \"\
acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.02919980245562281\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709218,\n\
\ \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709218\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.034926064766237906,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.034926064766237906\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.905982905982906,\n\
\ \"acc_stderr\": 0.019119892798924974,\n \"acc_norm\": 0.905982905982906,\n\
\ \"acc_norm_stderr\": 0.019119892798924974\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8722860791826309,\n\
\ \"acc_stderr\": 0.011935626313999878,\n \"acc_norm\": 0.8722860791826309,\n\
\ \"acc_norm_stderr\": 0.011935626313999878\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.791907514450867,\n \"acc_stderr\": 0.021855255263421795,\n\
\ \"acc_norm\": 0.791907514450867,\n \"acc_norm_stderr\": 0.021855255263421795\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4558659217877095,\n\
\ \"acc_stderr\": 0.01665722942458631,\n \"acc_norm\": 0.4558659217877095,\n\
\ \"acc_norm_stderr\": 0.01665722942458631\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8006535947712419,\n \"acc_stderr\": 0.02287581699346408,\n\
\ \"acc_norm\": 0.8006535947712419,\n \"acc_norm_stderr\": 0.02287581699346408\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7845659163987139,\n\
\ \"acc_stderr\": 0.023350225475471442,\n \"acc_norm\": 0.7845659163987139,\n\
\ \"acc_norm_stderr\": 0.023350225475471442\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8240740740740741,\n \"acc_stderr\": 0.021185893615225153,\n\
\ \"acc_norm\": 0.8240740740740741,\n \"acc_norm_stderr\": 0.021185893615225153\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5106382978723404,\n \"acc_stderr\": 0.02982074719142244,\n \
\ \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.02982074719142244\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5443285528031291,\n\
\ \"acc_stderr\": 0.012719949543032226,\n \"acc_norm\": 0.5443285528031291,\n\
\ \"acc_norm_stderr\": 0.012719949543032226\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7867647058823529,\n \"acc_stderr\": 0.024880971512294254,\n\
\ \"acc_norm\": 0.7867647058823529,\n \"acc_norm_stderr\": 0.024880971512294254\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7647058823529411,\n \"acc_stderr\": 0.01716058723504635,\n \
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.01716058723504635\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7836734693877551,\n \"acc_stderr\": 0.026358916334904028,\n\
\ \"acc_norm\": 0.7836734693877551,\n \"acc_norm_stderr\": 0.026358916334904028\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n\
\ \"acc_stderr\": 0.023335401790166323,\n \"acc_norm\": 0.8756218905472637,\n\
\ \"acc_norm_stderr\": 0.023335401790166323\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n\
\ \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5152998776009792,\n\
\ \"mc1_stderr\": 0.017495304473187902,\n \"mc2\": 0.665216588266765,\n\
\ \"mc2_stderr\": 0.014619883028401507\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8255722178374112,\n \"acc_stderr\": 0.010665187902498428\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5852918877937832,\n \
\ \"acc_stderr\": 0.013570623842304504\n }\n}\n```"
repo_url: https://huggingface.co/PSanni/MPOMixtral-8x7B-Instruct-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|arc:challenge|25_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|gsm8k|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hellaswag|10_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T14-23-30.207507.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T14-23-30.207507.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- '**/details_harness|winogrande|5_2024-01-14T14-23-30.207507.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-14T14-23-30.207507.parquet'
- config_name: results
data_files:
- split: 2024_01_14T14_23_30.207507
path:
- results_2024-01-14T14-23-30.207507.parquet
- split: latest
path:
- results_2024-01-14T14-23-30.207507.parquet
---
# Dataset Card for Evaluation run of PSanni/MPOMixtral-8x7B-Instruct-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [PSanni/MPOMixtral-8x7B-Instruct-v0.1](https://huggingface.co/PSanni/MPOMixtral-8x7B-Instruct-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named after the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PSanni__MPOMixtral-8x7B-Instruct-v0.1",
"harness_winogrande_5",
	split="latest")
```
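The split names above encode the run timestamp with underscores standing in for the `-` and `:` characters that are not allowed in split names. As a small illustration (the format is inferred from the split names in this card), they can be parsed back into `datetime` objects:

```python
from datetime import datetime

def parse_split_timestamp(split_name: str) -> datetime:
    """Convert a run split name like '2024_01_14T14_23_30.207507'
    back into a datetime (underscores stand in for '-' and ':')."""
    date_part, time_part = split_name.split("T")
    date_part = date_part.replace("_", "-")
    time_part = time_part.replace("_", ":")
    return datetime.fromisoformat(f"{date_part}T{time_part}")

print(parse_split_timestamp("2024_01_14T14_23_30.207507"))
# 2024-01-14 14:23:30.207507
```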
## Latest results
These are the [latest results from run 2024-01-14T14:23:30.207507](https://huggingface.co/datasets/open-llm-leaderboard/details_PSanni__MPOMixtral-8x7B-Instruct-v0.1/blob/main/results_2024-01-14T14-23-30.207507.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7021378898937154,
"acc_stderr": 0.03050401556178155,
"acc_norm": 0.7057392541812927,
"acc_norm_stderr": 0.031097861836160572,
"mc1": 0.5152998776009792,
"mc1_stderr": 0.017495304473187902,
"mc2": 0.665216588266765,
"mc2_stderr": 0.014619883028401507
},
"harness|arc:challenge|25": {
"acc": 0.6800341296928327,
"acc_stderr": 0.013631345807016193,
"acc_norm": 0.7098976109215017,
"acc_norm_stderr": 0.013261573677520764
},
"harness|hellaswag|10": {
"acc": 0.690300736904999,
"acc_stderr": 0.004614246282055377,
"acc_norm": 0.8795060744871539,
"acc_norm_stderr": 0.0032487292211528865
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6888888888888889,
"acc_stderr": 0.0399926287661772,
"acc_norm": 0.6888888888888889,
"acc_norm_stderr": 0.0399926287661772
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7960526315789473,
"acc_stderr": 0.0327900040631005,
"acc_norm": 0.7960526315789473,
"acc_norm_stderr": 0.0327900040631005
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7622641509433963,
"acc_stderr": 0.02619980880756193,
"acc_norm": 0.7622641509433963,
"acc_norm_stderr": 0.02619980880756193
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03309615177059006,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03309615177059006
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.033450369167889904,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.033450369167889904
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6851063829787234,
"acc_stderr": 0.03036358219723817,
"acc_norm": 0.6851063829787234,
"acc_norm_stderr": 0.03036358219723817
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5877192982456141,
"acc_stderr": 0.04630653203366596,
"acc_norm": 0.5877192982456141,
"acc_norm_stderr": 0.04630653203366596
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6275862068965518,
"acc_stderr": 0.04028731532947558,
"acc_norm": 0.6275862068965518,
"acc_norm_stderr": 0.04028731532947558
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47354497354497355,
"acc_stderr": 0.025715239811346758,
"acc_norm": 0.47354497354497355,
"acc_norm_stderr": 0.025715239811346758
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8387096774193549,
"acc_stderr": 0.020923327006423294,
"acc_norm": 0.8387096774193549,
"acc_norm_stderr": 0.020923327006423294
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5960591133004927,
"acc_stderr": 0.03452453903822033,
"acc_norm": 0.5960591133004927,
"acc_norm_stderr": 0.03452453903822033
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8585858585858586,
"acc_stderr": 0.024825909793343336,
"acc_norm": 0.8585858585858586,
"acc_norm_stderr": 0.024825909793343336
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9585492227979274,
"acc_stderr": 0.01438543285747646,
"acc_norm": 0.9585492227979274,
"acc_norm_stderr": 0.01438543285747646
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7205128205128205,
"acc_stderr": 0.022752388839776823,
"acc_norm": 0.7205128205128205,
"acc_norm_stderr": 0.022752388839776823
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.029869605095316908,
"acc_norm": 0.4,
"acc_norm_stderr": 0.029869605095316908
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.773109243697479,
"acc_stderr": 0.027205371538279472,
"acc_norm": 0.773109243697479,
"acc_norm_stderr": 0.027205371538279472
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4768211920529801,
"acc_stderr": 0.04078093859163083,
"acc_norm": 0.4768211920529801,
"acc_norm_stderr": 0.04078093859163083
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8715596330275229,
"acc_stderr": 0.014344977542914318,
"acc_norm": 0.8715596330275229,
"acc_norm_stderr": 0.014344977542914318
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.033723432716530624,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.033723432716530624
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.02485747808025045,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.02485747808025045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8776371308016878,
"acc_stderr": 0.02133174182974679,
"acc_norm": 0.8776371308016878,
"acc_norm_stderr": 0.02133174182974679
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699813,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699813
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.0349814938546247,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.0349814938546247
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.02919980245562281,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.02919980245562281
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037182,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037182
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709218,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709218
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.034926064766237906,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.034926064766237906
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.905982905982906,
"acc_stderr": 0.019119892798924974,
"acc_norm": 0.905982905982906,
"acc_norm_stderr": 0.019119892798924974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8722860791826309,
"acc_stderr": 0.011935626313999878,
"acc_norm": 0.8722860791826309,
"acc_norm_stderr": 0.011935626313999878
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.791907514450867,
"acc_stderr": 0.021855255263421795,
"acc_norm": 0.791907514450867,
"acc_norm_stderr": 0.021855255263421795
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4558659217877095,
"acc_stderr": 0.01665722942458631,
"acc_norm": 0.4558659217877095,
"acc_norm_stderr": 0.01665722942458631
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8006535947712419,
"acc_stderr": 0.02287581699346408,
"acc_norm": 0.8006535947712419,
"acc_norm_stderr": 0.02287581699346408
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7845659163987139,
"acc_stderr": 0.023350225475471442,
"acc_norm": 0.7845659163987139,
"acc_norm_stderr": 0.023350225475471442
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.021185893615225153,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.021185893615225153
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.02982074719142244,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.02982074719142244
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5443285528031291,
"acc_stderr": 0.012719949543032226,
"acc_norm": 0.5443285528031291,
"acc_norm_stderr": 0.012719949543032226
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7867647058823529,
"acc_stderr": 0.024880971512294254,
"acc_norm": 0.7867647058823529,
"acc_norm_stderr": 0.024880971512294254
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.01716058723504635,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.01716058723504635
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7836734693877551,
"acc_stderr": 0.026358916334904028,
"acc_norm": 0.7836734693877551,
"acc_norm_stderr": 0.026358916334904028
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166323,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166323
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5152998776009792,
"mc1_stderr": 0.017495304473187902,
"mc2": 0.665216588266765,
"mc2_stderr": 0.014619883028401507
},
"harness|winogrande|5": {
"acc": 0.8255722178374112,
"acc_stderr": 0.010665187902498428
},
"harness|gsm8k|5": {
"acc": 0.5852918877937832,
"acc_stderr": 0.013570623842304504
}
}
```
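Once loaded, a results dict in the shape shown above can be scanned to rank tasks by accuracy. A minimal sketch, using a small excerpt of the dict with values copied from the JSON:

```python
# Small excerpt of the per-task results shown above, used to
# illustrate ranking tasks by their "acc" metric.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
    "harness|hendrycksTest-marketing|5": {"acc": 0.905982905982906},
    "harness|gsm8k|5": {"acc": 0.5852918877937832},
}

# Sort (task, accuracy) pairs from strongest to weakest task.
ranked = sorted(
    ((task, scores["acc"]) for task, scores in results.items()),
    key=lambda item: item[1],
    reverse=True,
)

for task, acc in ranked:
    print(f"{acc:.3f}  {task}")
```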
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
bigbio/n2c2_2010 |
---
language:
- en
bigbio_language:
- English
license: other
multilinguality: monolingual
bigbio_license_shortname: DUA
pretty_name: n2c2 2010 Concepts, Assertions, and Relations
homepage: https://portal.dbmi.hms.harvard.edu/projects/n2c2-nlp/
bigbio_pubmed: False
bigbio_public: False
bigbio_tasks:
- NAMED_ENTITY_RECOGNITION
- RELATION_EXTRACTION
---
# Dataset Card for n2c2 2010 Concepts, Assertions, and Relations
## Dataset Description
- **Homepage:** https://portal.dbmi.hms.harvard.edu/projects/n2c2-nlp/
- **Pubmed:** False
- **Public:** False
- **Tasks:** NER, RE
The i2b2/VA corpus contained de-identified discharge summaries from Beth Israel
Deaconess Medical Center, Partners Healthcare, and University of Pittsburgh Medical
Center (UPMC). In addition, UPMC contributed de-identified progress notes to the
i2b2/VA corpus. This dataset contains the records from Beth Israel and Partners.
The 2010 i2b2/VA Workshop on Natural Language Processing Challenges for Clinical Records comprises three tasks:
1) a concept extraction task focused on the extraction of medical concepts from patient reports;
2) an assertion classification task focused on assigning assertion types for medical problem concepts;
3) a relation classification task focused on assigning relation types that hold between medical problems,
tests, and treatments.
i2b2 and the VA provided an annotated reference standard corpus for the three tasks.
Using this reference standard, 22 systems were developed for concept extraction,
21 for assertion classification, and 16 for relation classification.
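As an illustration of what concept annotations in this challenge look like, the raw i2b2/n2c2 2010 distribution (available only under the data use agreement) stores them in a line-oriented format along the lines of `c="concept text" line:token line:token||t="type"`. The example line and parser below are a sketch under that assumption; verify the exact format against the official n2c2 documentation before relying on it:

```python
import re

# Assumed i2b2/n2c2 2010 concept (.con) line format; check against
# the official documentation shipped with the data.
CON_PATTERN = re.compile(
    r'c="(?P<text>[^"]+)" (?P<start>\d+:\d+) (?P<end>\d+:\d+)\|\|t="(?P<type>[^"]+)"'
)

def parse_concept_line(line: str) -> dict:
    """Parse one concept annotation line into its text, span, and type."""
    match = CON_PATTERN.match(line)
    if match is None:
        raise ValueError(f"unrecognized concept line: {line!r}")
    return match.groupdict()

# Hypothetical example line in that style.
example = 'c="his home regimen" 27:2 27:4||t="treatment"'
print(parse_concept_line(example))
```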
## Citation Information
```
@article{DBLP:journals/jamia/UzunerSSD11,
author = {
Ozlem Uzuner and
Brett R. South and
Shuying Shen and
Scott L. DuVall
},
title = {2010 i2b2/VA challenge on concepts, assertions, and relations in clinical
text},
journal = {J. Am. Medical Informatics Assoc.},
volume = {18},
number = {5},
pages = {552--556},
year = {2011},
url = {https://doi.org/10.1136/amiajnl-2011-000203},
doi = {10.1136/amiajnl-2011-000203},
timestamp = {Mon, 11 May 2020 23:00:20 +0200},
biburl = {https://dblp.org/rec/journals/jamia/UzunerSSD11.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
|
heegyu/namuwiki | ---
license: cc-by-nc-sa-2.0
language:
- ko
language_creators:
- other
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
task_categories:
- other
---
# namu.wiki database dump
A dump of the https://namu.wiki/ database, taken on 2022-03-01.

- 867,024 rows
- download size: 3 GB
## Usage
```bash
pip install datasets
```
```python
from datasets import load_dataset
dataset = load_dataset("heegyu/namuwiki")
print(dataset["train"][0])
```
```
{'title': '!!아앗!!',
'text': '\n[목차]\n\n\'\'\'{{{+1 !!ああっと!!}}}\'\'\'\n\n== 개요 ==\n[[파일:3444050440.jpg|width=60%]]\n▲[[신 세계수의 미궁 2 파프니르기사|신 세계수의 미궁 2]]에서 뜬 !!아앗!!\n\n[[세계수의 미궁 시리즈]]에 전통으로 등장하는 대사. [[세계수의 미궁 2 제왕의 성배|2편]]부터 등장했으며 훌륭한 [[사망 플래그]]의 예시이다.\n\n세계수의 모험가들이 탐험하는 던전인 수해의 구석구석에는 채취/벌채/채굴 포인트가 있으며, 이를 위한 채집 스킬에 투자하면 제한된 채집 기회에서 보다 큰 이득을 챙길 수 있다. 그러나 분배할 수 있는 스킬 포인트는 한정되어 있기 때문에 채집 스킬에 투자하는 만큼 전투 스킬 레벨은 낮아지게 된다.[* 다만 채집 시스템은 신 세계수 시리즈의 그리모어 복제, 복합 채집 스킬인 야생의 감, 5편의 종족 특유 스킬, 크로스의 1레벨이 만렙인 채집 스킬 등으로 편의성이 점차 나아져서 채집 스킬 때문에 스킬 트리가 내려가는 일은 점점 줄어들었다.] !!아앗!!이 발생하는 과정을 요약하면 다음과 같다.\n\n 1. 채집용 캐릭터들로 이루어진 약한 파티(ex: [[레인저(세계수의 미궁 2)|레인저]] 5명)가 수해에 입장한다.\n 1. 필드 전투를 피해 채집 포인트에 도착한 후 열심히 아이템을 캐는 중에...\n 1. \'\'\'!!아앗!!\'\'\' ~~라플레시아가 나타났다!~~\n 이때 등장하는 것은 [[FOE(세계수의 미궁 시리즈)|FOE]]는 아니지만 \'\'\'훨씬 위층에 등장하는 강력한 필드 몬스터이며 선제 공격을 당하게 된다!\'\'\'\n 1. \'\'\'으앙 죽음\'\'\'(hage)\n\n여담으로 !!아앗!!의 유래는 1인칭 던전 크롤러의 원조 [[위저드리]]에서 함정을 건드렸을 때 나오는 대사 Oops!(おおっと!)라고 한다.\n\n== 각 작품에서의 모습 ==\n=== [[세계수의 미궁 2 제왕의 성배]] ===\n!!아앗!!의 악랄함은 첫 등장한 작품이자 시리즈 중에서도 불친절하기로 정평이 난 2편이 절정이었다. 그야말로 위의 !!아앗!! 시퀀스 그대로, 묻지도 따지지도 않고 채집할 때마다 일정 확률로 \'\'\'강제로\'\'\' 전투에 돌입해야 했다. 게다가 이럴 때 쓰라고 있는 레인저의 스킬 \'위험 감지(중간 확률로 적의 선제 공격을 무효화)\'는 정작 작동하지 않는다!\n\n참고로 2편에서 채집 도중 !!아앗!!이 뜰 확률은 [[http://www.atlusnet.jp/topic/detail/910|고작 1%다.]] [[던파확률의 법칙|낮아 보이는 확률이어도 플레이 중 한 번이라도 일어나는 것]]을 경험하는 체감 확률을 고려하여 확률을 설정한다고.\n\n=== [[세계수의 미궁 3 성해의 내방자]] ===\n다행히 채집 중 낮은 확률로 "좋은 아이템을 얻을 수 있을 것 같지만... 주변에서 몬스터들의 기척이 느껴진다."는 메시지가 뜨고 이때 운이 좋으면 레어 아이템을 얻을 수 있지만 반대의 경우 적과 싸우게 되는 것으로 조정되었다.\n\n=== [[세계수의 미궁 4 전승의 거신]] ===\n기본적인 것은 3편과 같지만, 4편에서는 움직이지 않고 채집할 때도 턴이 경과하도록 조정되었기 때문에 주변에 있는 FOE를 잊고 채집에 몰두하다가 FOE와 부딪히면 FOE 버전 !!아앗!!이 뜬다. 그리고 난이도 CASUAL로 플레이시, FOE로 인한 !!아앗!!을 제외하면 절대로 발생하지 않는다.\n\n=== [[신 세계수의 미궁 밀레니엄의 소녀|신 세계수의]] [[신 세계수의 미궁 2 파프니르기사|미궁 시리즈]] ===\n채집 방식이 한 턴으로 끝나는 구조[* 채집으로 한 번 아이템을 획득하면 "다시, (채집 스킬)에 의해..."가 뜨면서 한꺼번에 획득되는 구조.]로 바뀐 덕분인지 강제 조우로 다시 회귀해버렸다(...). 그나마 위험 감지 먹통과 같은 버그성 난점들은 수정되었다. 
그 이후에 나온 [[세계수의 미궁 5 오랜 신화의 끝]]과 시리즈의 집대성 작품이자 3DS 마지막 작품인 [[세계수의 미궁 X]]도 마찬가지.\n\n=== [[세계수의 미궁 X]] ===\n본작의 채집은 신 세계수 시리즈와 같은 매커니즘이라 굳이 언급할 필요는 없으나, 퀘스트중에 2편의 !!아앗!! 시퀀스를 재현하면서 \'\'\'라플레시아\'\'\'가 등장하는 퀘스트가 존재한다.(...) 깨알같이 시스템 메세지 창이 아니라 대화창을 이용해서 완벽 재현한 것이 포인트.\n\n=== [[페르소나 Q 섀도우 오브 더 래버린스]] ===\n세계수 시스템을 기반으로 한 [[페르소나 시리즈]]와의 콜라보 작품인 페르소나 Q에서도 등장한다. 3, 4편과 같이 파워 스폿에서 채집 도중 메시지가 뜨며, 실패하면 파티에 참가하고 있는 멤버 중 한 명의 [[http://nico.ms/sm25683358|!!아앗!! 하는 음성]] ~~또는 [[코로마루|개소리]]~~과 함께 그 던전의 \'강적\'인 거대 [[섀도(페르소나 시리즈)|섀도우]]가 나타난다.\n\n그러나 내비 전용 스킬인 뱀눈 노려보기(위험 감지와 같은 효과)와 채집 보조 스킬은 파티의 전투력에 전혀 지장을 주지 않으며, \'대안심\'을 달면 거의 볼 일이 없어져서 초중반 이후에는 존재감이 급격히 줄어든다.\n[[분류:세계수의 미궁 시리즈]]',
'contributors': '110.46.34.123,kirby10,max0243,218.54.117.149,ruby3141,121.165.63.239,iviyuki,1.229.200.194,anatra95,kiri47,175.127.134.2,nickchaos71,chkong1998,kiwitree2,namubot,huwieblusnow',
'namespace': ''}
``` |
autoevaluate/autoeval-eval-kmfoda__booksum-kmfoda__booksum-61a81c-1518854985 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- kmfoda/booksum
eval_info:
task: summarization
model: pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP15
metrics: []
dataset_name: kmfoda/booksum
dataset_config: kmfoda--booksum
dataset_split: test
col_mapping:
text: chapter
target: summary_text
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP15
* Dataset: kmfoda/booksum
* Config: kmfoda--booksum
* Split: test
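The `col_mapping` in the metadata above tells the evaluator which dataset columns feed the summarization pipeline (`chapter` as input text, `summary_text` as reference). A minimal plain-Python sketch of applying such a mapping to a raw record (the record contents here are purely illustrative):

```python
# Sketch: rename raw booksum columns into the evaluator's expected schema
# using the col_mapping declared in this card's metadata.
col_mapping = {"text": "chapter", "target": "summary_text"}

def remap(record: dict, mapping: dict) -> dict:
    """Return a new record keyed by the evaluator's column names."""
    return {dst: record[src] for dst, src in mapping.items()}

raw = {"chapter": "It was the best of times...", "summary_text": "A famous opening."}
mapped = remap(raw, col_mapping)
print(mapped["text"])    # prints the chapter text
print(mapped["target"])  # prints the reference summary
```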
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model. |
ASCCCCCCCC/bill | ---
license: apache-2.0
---
|
irds/mmarco_v2_pt_train | ---
pretty_name: '`mmarco/v2/pt/train`'
viewer: false
source_datasets: ['irds/mmarco_v2_pt']
task_categories:
- text-retrieval
---
# Dataset Card for `mmarco/v2/pt/train`
The `mmarco/v2/pt/train` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/mmarco#mmarco/v2/pt/train).
# Data
This dataset provides:
- `queries` (i.e., topics); count=808,731
- `qrels`: (relevance assessments); count=532,761
- `docpairs`; count=39,780,811
- For `docs`, use [`irds/mmarco_v2_pt`](https://huggingface.co/datasets/irds/mmarco_v2_pt)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/mmarco_v2_pt_train', 'queries')
for record in queries:
    record  # {'query_id': ..., 'text': ...}

qrels = load_dataset('irds/mmarco_v2_pt_train', 'qrels')
for record in qrels:
    record  # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}

docpairs = load_dataset('irds/mmarco_v2_pt_train', 'docpairs')
for record in docpairs:
    record  # {'query_id': ..., 'doc_id_a': ..., 'doc_id_b': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Bonifacio2021MMarco,
title={{mMARCO}: A Multilingual Version of {MS MARCO} Passage Ranking Dataset},
author={Luiz Henrique Bonifacio and Israel Campiotti and Roberto Lotufo and Rodrigo Nogueira},
year={2021},
journal={arXiv:2108.13897}
}
```
|
m-ric/transformers_documentation_en | ---
license: apache-2.0
---
|
joey234/mmlu-high_school_psychology-original-neg | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 23104.963302752294
num_examples: 79
download_size: 23403
dataset_size: 23104.963302752294
---
# Dataset Card for "mmlu-high_school_psychology-original-neg"
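The `answer` feature declared above is a `class_label`, so each answer is stored as an integer index into the names `A`–`D`. A plain-Python sketch of that mapping (the `datasets` library exposes the same conversion via `ClassLabel.int2str`/`ClassLabel.str2int`):

```python
# The answer column encodes letters A-D as integers 0-3, per the
# class_label names in this card's metadata.
ANSWER_NAMES = ["A", "B", "C", "D"]

def int2str(label: int) -> str:
    """Decode an integer label into its answer letter."""
    return ANSWER_NAMES[label]

def str2int(name: str) -> int:
    """Encode an answer letter back into its integer label."""
    return ANSWER_NAMES.index(name)

print(int2str(2))    # prints C
print(str2int("D"))  # prints 3
```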
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mvasiliniuc/iva-swift-codeint-clean-valid | ---
annotations_creators:
- crowdsourced
license: other
language_creators:
- crowdsourced
language:
- code
task_categories:
- text-generation
tags:
- code
- ios
- native iOS development
- curated
- validation
size_categories:
- 10K<n<100K
source_datasets: []
pretty_name: iva-swift-codeint-clean-valid
task_ids:
- language-modeling
---
# IVA Swift GitHub Code Dataset - Curated - Validation
## Dataset Description
This is the curated validation split of the IVA Swift dataset extracted from GitHub.
It contains curated Swift files gathered to train and validate a code generation model.
The dataset contains only a validation split.
For the unsliced and train versions, please check the following links:
* Clean Version Unsliced: https://huggingface.co/datasets/mvasiliniuc/iva-swift-codeint-clean
* Clean Version Train: https://huggingface.co/datasets/mvasiliniuc/iva-swift-codeint-clean-train
Information about the dataset structure, the data involved, licensing, and other standard Dataset Card details is available in the versions linked above and applies to this dataset as well.
# Considerations for Using the Data
The dataset comprises source code from various repositories, potentially containing harmful or biased code,
along with sensitive information such as passwords or usernames. |
andreiwww/test_tut | ---
license: other
---
|
open-llm-leaderboard/details_Kukedlc__NeuralMaths-Experiment-7b | ---
pretty_name: Evaluation run of Kukedlc/NeuralMaths-Experiment-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Kukedlc/NeuralMaths-Experiment-7b](https://huggingface.co/Kukedlc/NeuralMaths-Experiment-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kukedlc__NeuralMaths-Experiment-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-29T21:48:56.204016](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__NeuralMaths-Experiment-7b/blob/main/results_2024-03-29T21-48-56.204016.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6556680183910659,\n\
\ \"acc_stderr\": 0.03185357048069436,\n \"acc_norm\": 0.6547192071330743,\n\
\ \"acc_norm_stderr\": 0.03252307813134486,\n \"mc1\": 0.47123623011015914,\n\
\ \"mc1_stderr\": 0.01747451384852552,\n \"mc2\": 0.6382648558896807,\n\
\ \"mc2_stderr\": 0.015184759114654476\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6680887372013652,\n \"acc_stderr\": 0.013760988200880538,\n\
\ \"acc_norm\": 0.697098976109215,\n \"acc_norm_stderr\": 0.013428241573185349\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6942840071698865,\n\
\ \"acc_stderr\": 0.004597684609707823,\n \"acc_norm\": 0.8748257319259112,\n\
\ \"acc_norm_stderr\": 0.0033024011069263223\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724057,\n\
\ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724057\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778405,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778405\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n\
\ \"acc_stderr\": 0.02302589961718872,\n \"acc_norm\": 0.7935483870967742,\n\
\ \"acc_norm_stderr\": 0.02302589961718872\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033484,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033484\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297794,\n \
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297794\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8550458715596331,\n \"acc_stderr\": 0.015094215699700486,\n \"\
acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.015094215699700486\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8627450980392157,\n \"acc_stderr\": 0.02415222596280158,\n \"\
acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.02415222596280158\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579654,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579654\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.030636591348699803,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.030636591348699803\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313728,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313728\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.04738975119274155,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.04738975119274155\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8352490421455939,\n\
\ \"acc_stderr\": 0.01326534626132379,\n \"acc_norm\": 0.8352490421455939,\n\
\ \"acc_norm_stderr\": 0.01326534626132379\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.01638463841038082,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.01638463841038082\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.4654498044328553,\n \"acc_stderr\": 0.012739711554045702,\n\
\ \"acc_norm\": 0.4654498044328553,\n \"acc_norm_stderr\": 0.012739711554045702\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n \"\
acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.684640522875817,\n \"acc_stderr\": 0.018798086284886894,\n \
\ \"acc_norm\": 0.684640522875817,\n \"acc_norm_stderr\": 0.018798086284886894\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.02619392354445412,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.02619392354445412\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.47123623011015914,\n\
\ \"mc1_stderr\": 0.01747451384852552,\n \"mc2\": 0.6382648558896807,\n\
\ \"mc2_stderr\": 0.015184759114654476\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.824782951854775,\n \"acc_stderr\": 0.01068417922770617\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7520849128127369,\n \
\ \"acc_stderr\": 0.011893980214826166\n }\n}\n```"
repo_url: https://huggingface.co/Kukedlc/NeuralMaths-Experiment-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|arc:challenge|25_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|gsm8k|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hellaswag|10_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-48-56.204016.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T21-48-56.204016.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- '**/details_harness|winogrande|5_2024-03-29T21-48-56.204016.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-29T21-48-56.204016.parquet'
- config_name: results
data_files:
- split: 2024_03_29T21_48_56.204016
path:
- results_2024-03-29T21-48-56.204016.parquet
- split: latest
path:
- results_2024-03-29T21-48-56.204016.parquet
---
# Dataset Card for Evaluation run of Kukedlc/NeuralMaths-Experiment-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kukedlc/NeuralMaths-Experiment-7b](https://huggingface.co/Kukedlc/NeuralMaths-Experiment-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
# Load the Winogrande details for this model; the "latest" split
# (defined in the YAML above) points to the most recent run.
data = load_dataset("open-llm-leaderboard/details_Kukedlc__NeuralMaths-Experiment-7b",
	"harness_winogrande_5",
	split="latest")
```
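Per-run splits are named after the run timestamp (e.g. `2024_03_29T21_48_56.204016`). Because the timestamp fields are zero-padded, lexicographic order matches chronological order, so the newest run can be picked with a plain `max()`. A minimal sketch (the helper name and the assumption that all non-`latest` splits follow this naming convention are ours, not part of the card):

```python
def latest_run_split(split_names):
    """Return the most recent timestamped split name.

    Assumes splits follow the card's convention, e.g.
    '2024_03_29T21_48_56.204016'; zero-padded fields make
    lexicographic order equal chronological order.
    """
    timestamped = [s for s in split_names if s != "latest"]
    return max(timestamped)


# Example with the single run present in this repository:
print(latest_run_split(["latest", "2024_03_29T21_48_56.204016"]))
# 2024_03_29T21_48_56.204016
```

This is only needed when a repository accumulates several runs; with one run, the `latest` split already resolves to it.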
## Latest results

These are the [latest results from run 2024-03-29T21:48:56.204016](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__NeuralMaths-Experiment-7b/blob/main/results_2024-03-29T21-48-56.204016.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find the results for each task in its own configuration, under the "latest" split):
```python
{
"all": {
"acc": 0.6556680183910659,
"acc_stderr": 0.03185357048069436,
"acc_norm": 0.6547192071330743,
"acc_norm_stderr": 0.03252307813134486,
"mc1": 0.47123623011015914,
"mc1_stderr": 0.01747451384852552,
"mc2": 0.6382648558896807,
"mc2_stderr": 0.015184759114654476
},
"harness|arc:challenge|25": {
"acc": 0.6680887372013652,
"acc_stderr": 0.013760988200880538,
"acc_norm": 0.697098976109215,
"acc_norm_stderr": 0.013428241573185349
},
"harness|hellaswag|10": {
"acc": 0.6942840071698865,
"acc_stderr": 0.004597684609707823,
"acc_norm": 0.8748257319259112,
"acc_norm_stderr": 0.0033024011069263223
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724057,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778405,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778405
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.02302589961718872,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.02302589961718872
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033484,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033484
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297794,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297794
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8550458715596331,
"acc_stderr": 0.015094215699700486,
"acc_norm": 0.8550458715596331,
"acc_norm_stderr": 0.015094215699700486
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.02415222596280158,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.02415222596280158
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579654,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579654
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699803,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699803
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313728,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313728
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.04738975119274155,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.04738975119274155
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8352490421455939,
"acc_stderr": 0.01326534626132379,
"acc_norm": 0.8352490421455939,
"acc_norm_stderr": 0.01326534626132379
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4,
"acc_stderr": 0.01638463841038082,
"acc_norm": 0.4,
"acc_norm_stderr": 0.01638463841038082
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.012739711554045702,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.012739711554045702
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462927,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462927
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.684640522875817,
"acc_stderr": 0.018798086284886894,
"acc_norm": 0.684640522875817,
"acc_norm_stderr": 0.018798086284886894
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.02619392354445412,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.02619392354445412
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.47123623011015914,
"mc1_stderr": 0.01747451384852552,
"mc2": 0.6382648558896807,
"mc2_stderr": 0.015184759114654476
},
"harness|winogrande|5": {
"acc": 0.824782951854775,
"acc_stderr": 0.01068417922770617
},
"harness|gsm8k|5": {
"acc": 0.7520849128127369,
"acc_stderr": 0.011893980214826166
}
}
```
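The `"all"` entry above is an aggregate over the per-task entries. To illustrate how such an aggregate relates to the task-level scores, here is a small sketch that computes a macro-average of `acc` over task entries from a results dictionary shaped like the JSON above (the helper name and the tiny sample dict are ours; the real aggregation on the leaderboard may weight tasks differently):

```python
def macro_avg_acc(results):
    """Average 'acc' across per-task entries, skipping the 'all'
    aggregate and tasks that report other metrics (e.g. mc1/mc2)."""
    accs = [metrics["acc"] for task, metrics in results.items()
            if task != "all" and "acc" in metrics]
    return sum(accs) / len(accs)


# Illustrative subset of the results above:
sample = {
    "all": {"acc": 0.6557},
    "harness|arc:challenge|25": {"acc": 0.668},
    "harness|hellaswag|10": {"acc": 0.694},
    "harness|truthfulqa:mc|0": {"mc1": 0.471},  # no 'acc' key, skipped
}
print(round(macro_avg_acc(sample), 3))  # 0.681
```

With the full results dictionary loaded from the JSON file linked above, the same function reproduces a simple unweighted mean over all tasks that report `acc`.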
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
nlplabtdtu/edu_data_with_tag | ---
dataset_info:
features:
- name: title
dtype: string
- name: body
dtype: string
- name: url
dtype: string
- name: uni
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 2035056554
num_examples: 213847
download_size: 775920816
dataset_size: 2035056554
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_151 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 973366352.0
num_examples: 191156
download_size: 991749867
dataset_size: 973366352.0
---
# Dataset Card for "chunk_151"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/Multi-race_7_Expressions_Recognition_Data | ---
---
# Dataset Card for Nexdata/Multi-race_7_Expressions_Recognition_Data
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/973?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
25,998 People Multi-race 7 Expressions Recognition Data. The data includes both males and females. The age distribution ranges from children to the elderly, with young and middle-aged people forming the majority. For each person, 7 images were collected. The data diversity covers different facial poses, expressions, lighting conditions and scenes. The data can be used for tasks such as facial expression recognition.
For more details, please refer to the link: https://www.nexdata.ai/datasets/973?source=Huggingface
### Supported Tasks and Leaderboards
face-detection, computer-vision: The dataset can be used to train a model for facial expression recognition.
### Languages
English
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions
|
databoks-irfan/unlabeled-twitter-comments | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5083473
num_examples: 51943
- name: test
num_bytes: 265914
num_examples: 12986
download_size: 3435481
dataset_size: 5349387
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
joey234/mmlu-astronomy-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
- name: neg_prompt
dtype: string
- name: fewshot_context_neg
dtype: string
- name: fewshot_context_ori
dtype: string
splits:
- name: dev
num_bytes: 9251
num_examples: 5
- name: test
num_bytes: 1792879
num_examples: 152
download_size: 146597
dataset_size: 1802130
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
# Dataset Card for "mmlu-astronomy-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Boese0601/CS535_2024spr | ---
license: mit
---
|
ohmno2/webui-config | ---
license: mit
---
|
phiyodr/flickr30k_long | ---
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
- name: sentids
dtype: string
- name: split
dtype: string
- name: img_id
dtype: string
- name: filename
dtype: string
splits:
- name: test
num_bytes: 21272586635.95
num_examples: 155070
download_size: 4485309202
dataset_size: 21272586635.95
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
distilled-one-sec-cv12-each-chunk-uniq/chunk_144 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1096431272.0
num_examples: 213646
download_size: 1123334163
dataset_size: 1096431272.0
---
# Dataset Card for "chunk_144"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
phanvancongthanh/data_deduplicated_part05 | ---
dataset_info:
features:
- name: smiles
dtype: string
splits:
- name: train
num_bytes: 3402771276
num_examples: 70962925
download_size: 1792627444
dataset_size: 3402771276
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "data_deduplicated_part05"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
erhwenkuo/wikinews-zhtw | ---
dataset_info:
config_name: '20231001'
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 13647957
num_examples: 9827
download_size: 8803739
dataset_size: 13647957
configs:
- config_name: '20231001'
data_files:
- split: train
path: 20231001/train-*
license: cc-by-sa-3.0
task_categories:
- text-generation
language:
- zh
size_categories:
- 1K<n<10K
---
# Dataset Card for "wikinews-zhtw"
Wikinews is an online media outlet run by volunteers, i.e., citizen journalists. It is also a free-content wiki and one of the Wikimedia projects, operated by the Wikimedia Foundation. Wikinews works through collaborative journalism and strives to report news from a neutral point of view, including original first-hand reporting and interviews.
This dataset is built from the Chinese `zhwikinews` download files in the Wikimedia dumps (https://dumps.wikimedia.org/). Each example contains the full text of one Wikinews article, cleaned to remove unwanted parts.
- **Homepage:** [https://dumps.wikimedia.org](https://dumps.wikimedia.org)
- **zhwikinews downloads:** [https://dumps.wikimedia.org/zhwikinews](https://dumps.wikimedia.org/zhwikinews/)
## Data Dump Versions
Since the Wikimedia site dumps its data periodically, the following dumps were available for download as of `2023/10/10`:
|Dump directory|Dump timestamp|
|-------------|--------|
|`20230520/`|01-Jul-2023 09:28|
|`20230601/`|20-Jul-2023 09:41|
|`20230620/`|01-Aug-2023 09:35|
|`20230701/`|20-Aug-2023 09:49|
|`20230720/`|01-Sep-2023 09:35|
|`20230801/`|20-Sep-2023 09:46|
|`20230820/`|01-Oct-2023 09:42|
|`20230901/`|02-Sep-2023 14:47|
|`20230920/`|21-Sep-2023 14:41|
|`20231001/`|10-Oct-2023 03:50|
|`latest/`|10-Oct-2023 03:50|
This dataset is periodically rebuilt from the most recent clearly dated dump, which is downloaded and cleaned for easier verification and use.
## Download and Cleaning
1. Download the zhwikinews data dump file
2. Extract the article content with the [WikiExtractor](https://github.com/attardi/wikiextractor) package
3. Clean the data and convert it into a jsonl file
4. Load the jsonl file with the Hugging Face [Datasets](https://pypi.org/project/datasets/) package and upload it to the Hugging Face Hub
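Steps 3-4 above can be sketched in a few lines. This is an illustrative outline only: the record schema follows this card, but the helper names and the exact cleaning rules are assumptions, not the dataset's actual build script.

```python
import json


def to_records(articles):
    """Convert WikiExtractor-style article dicts into this card's schema,
    dropping articles whose text is empty after stripping.
    (Illustrative sketch; the real cleaning rules may differ.)"""
    records = []
    for art in articles:
        text = art.get("text", "").strip()
        if not text:
            continue  # drop empty articles
        records.append({
            "id": str(art["id"]),
            "url": art.get("url", ""),
            "title": art.get("title", ""),
            "text": text,
        })
    return records


def write_jsonl(records, path):
    """Write records to a jsonl file, one JSON object per line."""
    with open(path, "w", encoding="utf-8") as f:
        for rec in records:
            f.write(json.dumps(rec, ensure_ascii=False) + "\n")
```

A jsonl file produced this way can then be loaded with `load_dataset("json", data_files=...)` before being pushed to the Hub.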
## Dataset Structure
An example looks like the following:
```python
{'id': '35',
 'url': 'https://zh.wikinews.org/wiki?curid=35',
 'title': 'EDWIN與CUELLO遭統一獅隊解約',
 'text': '曾經打過中國棒球聯賽的兩位外援球員EDWIN(臺譯:愛力)與CUELLO(臺譯:阿-{A|裡}-),昨天傳出...'}
```
## Data Fields
The data fields are the same across all configurations:
- `id (str)`: ID of the article.
- `url (str)`: URL of the article.
- `title (str)`: Title of the article.
- `text (str)`: Text content of the article.
## Usage
```python
from datasets import load_dataset

# Pass the date of the data dump to use as the second argument
load_dataset("erhwenkuo/wikinews-zhtw", "20231001")
```
## Licensing Information
Most Wikipedia article text and many of its images are co-licensed under the `Creative Commons Attribution-ShareAlike 3.0 Unported License (CC BY-SA)` and the `GNU Free Documentation License (GFDL)`.
## Citation
```
@ONLINE{wikidump,
author = "Wikimedia Foundation",
title = "Wikimedia Downloads",
url = "https://dumps.wikimedia.org"
}
``` |
AdapterOcean/data-standardized_cluster_2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 80109314
num_examples: 7805
download_size: 22894344
dataset_size: 80109314
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "data-standardized_cluster_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pheid/fashion_image_caption-100-v2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 22820471.0
num_examples: 100
download_size: 22820373
dataset_size: 22820471.0
---
# Dataset Card for "fashion_image_caption-100-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
naorm/caption-eval-screen2words-pix2struct | ---
dataset_info:
features:
- name: model_name
dtype: string
- name: metric_name
dtype: string
- name: value
dtype: float64
splits:
- name: train
num_bytes: 577
num_examples: 14
download_size: 2010
dataset_size: 577
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
BarraHome/rezephyr_merged_4bit | ---
license: mit
---
|
wanyu/IteraTeR_full_doc | ---
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text2text-generation
task_ids: []
pretty_name: IteraTeR_full_doc
language_bcp47:
- en-US
tags:
- conditional-text-generation
- text-editing
---
Paper: [Understanding Iterative Revision from Human-Written Text](https://arxiv.org/abs/2203.03802)
Authors: Wanyu Du, Vipul Raheja, Dhruv Kumar, Zae Myung Kim, Melissa Lopez, Dongyeop Kang
Github repo: https://github.com/vipulraheja/IteraTeR
|
irds/antique_train_split200-train | ---
pretty_name: '`antique/train/split200-train`'
viewer: false
source_datasets: ['irds/antique']
task_categories:
- text-retrieval
---
# Dataset Card for `antique/train/split200-train`
The `antique/train/split200-train` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/antique#antique/train/split200-train).
# Data
This dataset provides:
- `queries` (i.e., topics); count=2,226
- `qrels`: (relevance assessments); count=25,229
- For `docs`, use [`irds/antique`](https://huggingface.co/datasets/irds/antique)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/antique_train_split200-train', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/antique_train_split200-train', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ...}
```
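Since the documents live in the separate `irds/antique` dataset, a qrels record can be joined to its document text by `doc_id`. A minimal sketch with toy records follows; the join helper is illustrative and not part of the package, while the field names follow the schemas above.

```python
def join_qrels_to_docs(qrels, docs):
    """Attach each document's text to the qrels records sharing its doc_id.

    Illustrative helper: builds an in-memory index from doc_id to text,
    then annotates every qrels record with a `doc_text` field.
    """
    doc_index = {d["doc_id"]: d["text"] for d in docs}
    return [{**q, "doc_text": doc_index.get(q["doc_id"])} for q in qrels]
```

With the real data, `docs` would be loaded from `irds/antique` as described above.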
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@inproceedings{Hashemi2020Antique,
title={ANTIQUE: A Non-Factoid Question Answering Benchmark},
author={Helia Hashemi and Mohammad Aliannejadi and Hamed Zamani and Bruce Croft},
booktitle={ECIR},
year={2020}
}
```
|
heliosprime/twitter_dataset_1712946704 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 7414
num_examples: 17
download_size: 8303
dataset_size: 7414
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1712946704"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Basilisk1897/Image_Folder | ---
license: openrail
size_categories:
- n<1K
--- |
lmg-anon/VNTL | ---
language:
- en
- ja
license: apache-2.0
task_categories:
- translation
pretty_name: VNTL
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 87818180
num_examples: 1
download_size: 41285782
dataset_size: 87818180
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
tags:
- not-for-all-audiences
---
|
tortuecookie/testing | ---
license: mit
---
|
wu981526092/MGSD_V2 | ---
license: mit
---
|
nagyigergo/gyurcsany | ---
license: unknown
---
|
Arnav2612/Proteins | ---
license: mit
---
|
joey234/mmlu-high_school_chemistry-verbal-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 91375
num_examples: 203
download_size: 51292
dataset_size: 91375
---
# Dataset Card for "mmlu-high_school_chemistry-verbal-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Minata/1024src_fm_fc_ms_ff_method2testcases_v0 | ---
dataset_info:
features:
- name: method2testcases
dtype: string
splits:
- name: train
num_bytes: 710933017.7210523
num_examples: 269325
- name: test
num_bytes: 188946470.33868438
num_examples: 69226
download_size: 130672425
dataset_size: 899879488.0597367
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Agress0r/bianca_veritas_dataset | ---
license: cc-by-4.0
---
|
kxly/princess_tutu | ---
language:
- en
license: creativeml-openrail-m
thumbnail: >-
https://huggingface.co/datasets/kxly/princess_tutu/blob/main/princess_tutu_showcase.png
tags:
- stable-diffusion
- text-to-image
- image-to-image
inference: false
pretty_name: Princess Tutu
---
# Character Embedding - Princess Tutu/Ahiru

## Usage
To use an embedding, download the .pt file and place it in "\stable-diffusion-webui\embeddings".
In your prompt, write ```"princess_tutu-6500"```.
## License
This embedding is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage.
The CreativeML OpenRAIL License specifies:
1. You can't use the embedding to deliberately produce nor share illegal or harmful outputs or content
2. The authors claim no rights on the outputs you generate, you are free to use them and are accountable for their use which must not go against the provisions set in the license
3. You may re-distribute the weights and use the embedding commercially and/or as a service. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M to all your users (please read the license entirely and carefully)
[Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license) |
sezer12138/ade20k_image_classification | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
'2': '2'
'3': '3'
'4': '4'
'5': '5'
'6': '6'
'7': '7'
'8': '8'
'9': '9'
'10': '10'
'11': '11'
'12': '12'
'13': '13'
'14': '14'
'15': '15'
'16': '16'
'17': '17'
'18': '18'
'19': '19'
'20': '20'
'21': '21'
'22': '22'
'23': '23'
'24': '24'
'25': '25'
'26': '26'
'27': '27'
'28': '28'
'29': '29'
'30': '30'
'31': '31'
'32': '32'
'33': '33'
'34': '34'
'35': '35'
'36': '36'
'37': '37'
'38': '38'
'39': '39'
'40': '40'
'41': '41'
'42': '42'
'43': '43'
'44': '44'
'45': '45'
'46': '46'
'47': '47'
'48': '48'
'49': '49'
'50': '50'
'51': '51'
'52': '52'
'53': '53'
'54': '54'
'55': '55'
'56': '56'
'57': '57'
'58': '58'
'59': '59'
'60': '60'
'61': '61'
'62': '62'
'63': '63'
'64': '64'
'65': '65'
'66': '66'
'67': '67'
'68': '68'
'69': '69'
'70': '70'
'71': '71'
'72': '72'
'73': '73'
'74': '74'
'75': '75'
'76': '76'
'77': '77'
'78': '78'
'79': '79'
'80': '80'
'81': '81'
'82': '82'
'83': '83'
'84': '84'
'85': '85'
'86': '86'
'87': '87'
'88': '88'
'89': '89'
'90': '90'
'91': '91'
'92': '92'
'93': '93'
'94': '94'
'95': '95'
'96': '96'
'97': '97'
'98': '98'
'99': '99'
'100': '100'
'101': '101'
'102': '102'
'103': '103'
'104': '104'
'105': '105'
'106': '106'
'107': '107'
'108': '108'
'109': '109'
'110': '110'
'111': '111'
'112': '112'
'113': '113'
'114': '114'
'115': '115'
'116': '116'
'117': '117'
'118': '118'
'119': '119'
'120': '120'
'121': '121'
'122': '122'
'123': '123'
'124': '124'
'125': '125'
'126': '126'
'127': '127'
'128': '128'
'129': '129'
'130': '130'
'131': '131'
'132': '132'
'133': '133'
'134': '134'
'135': '135'
'136': '136'
'137': '137'
'138': '138'
'139': '139'
'140': '140'
'141': '141'
'142': '142'
'143': '143'
'144': '144'
'145': '145'
'146': '146'
'147': '147'
'148': '148'
'149': '149'
'150': '150'
'151': '151'
'152': '152'
'153': '153'
'154': '154'
'155': '155'
'156': '156'
'157': '157'
'158': '158'
'159': '159'
'160': '160'
'161': '161'
'162': '162'
'163': '163'
'164': '164'
'165': '165'
'166': '166'
'167': '167'
'168': '168'
'169': '169'
'170': '170'
'171': '171'
'172': '172'
'173': '173'
'174': '174'
'175': '175'
'176': '176'
'177': '177'
'178': '178'
'179': '179'
'180': '180'
'181': '181'
'182': '182'
'183': '183'
'184': '184'
'185': '185'
'186': '186'
'187': '187'
'188': '188'
'189': '189'
'190': '190'
'191': '191'
'192': '192'
'193': '193'
'194': '194'
'195': '195'
'196': '196'
'197': '197'
'198': '198'
'199': '199'
'200': '200'
'201': '201'
'202': '202'
'203': '203'
'204': '204'
'205': '205'
'206': '206'
'207': '207'
'208': '208'
'209': '209'
'210': '210'
'211': '211'
'212': '212'
'213': '213'
'214': '214'
'215': '215'
'216': '216'
'217': '217'
'218': '218'
'219': '219'
'220': '220'
'221': '221'
'222': '222'
'223': '223'
'224': '224'
'225': '225'
'226': '226'
'227': '227'
'228': '228'
'229': '229'
'230': '230'
'231': '231'
'232': '232'
'233': '233'
'234': '234'
'235': '235'
'236': '236'
'237': '237'
'238': '238'
'239': '239'
'240': '240'
'241': '241'
'242': '242'
'243': '243'
'244': '244'
'245': '245'
'246': '246'
'247': '247'
'248': '248'
'249': '249'
'250': '250'
'251': '251'
'252': '252'
'253': '253'
'254': '254'
'255': '255'
'256': '256'
'257': '257'
'258': '258'
'259': '259'
'260': '260'
'261': '261'
'262': '262'
'263': '263'
'264': '264'
'265': '265'
'266': '266'
'267': '267'
'268': '268'
'269': '269'
'270': '270'
'271': '271'
'272': '272'
'273': '273'
'274': '274'
'275': '275'
'276': '276'
'277': '277'
'278': '278'
'279': '279'
'280': '280'
'281': '281'
'282': '282'
'283': '283'
'284': '284'
'285': '285'
'286': '286'
'287': '287'
'288': '288'
'289': '289'
'290': '290'
'291': '291'
'292': '292'
'293': '293'
'294': '294'
'295': '295'
'296': '296'
'297': '297'
'298': '298'
'299': '299'
'300': '300'
'301': '301'
'302': '302'
'303': '303'
'304': '304'
'305': '305'
'306': '306'
'307': '307'
'308': '308'
'309': '309'
'310': '310'
'311': '311'
'312': '312'
'313': '313'
'314': '314'
'315': '315'
'316': '316'
'317': '317'
'318': '318'
'319': '319'
'320': '320'
'321': '321'
'322': '322'
'323': '323'
'324': '324'
'325': '325'
'326': '326'
'327': '327'
'328': '328'
'329': '329'
'330': '330'
'331': '331'
'332': '332'
'333': '333'
'334': '334'
'335': '335'
'336': '336'
'337': '337'
'338': '338'
'339': '339'
'340': '340'
'341': '341'
'342': '342'
'343': '343'
'344': '344'
'345': '345'
'346': '346'
'347': '347'
'348': '348'
'349': '349'
'350': '350'
'351': '351'
'352': '352'
'353': '353'
'354': '354'
'355': '355'
'356': '356'
'357': '357'
'358': '358'
'359': '359'
'360': '360'
'361': '361'
'362': '362'
'363': '363'
'364': '364'
'365': '365'
'366': '366'
'367': '367'
'368': '368'
'369': '369'
'370': '370'
'371': '371'
'372': '372'
'373': '373'
'374': '374'
'375': '375'
'376': '376'
'377': '377'
'378': '378'
'379': '379'
'380': '380'
'381': '381'
'382': '382'
'383': '383'
'384': '384'
'385': '385'
'386': '386'
'387': '387'
'388': '388'
'389': '389'
'390': '390'
'391': '391'
'392': '392'
'393': '393'
'394': '394'
'395': '395'
'396': '396'
'397': '397'
'398': '398'
'399': '399'
'400': '400'
'401': '401'
'402': '402'
'403': '403'
'404': '404'
'405': '405'
'406': '406'
'407': '407'
'408': '408'
'409': '409'
'410': '410'
'411': '411'
'412': '412'
'413': '413'
'414': '414'
'415': '415'
'416': '416'
'417': '417'
'418': '418'
'419': '419'
'420': '420'
'421': '421'
'422': '422'
'423': '423'
'424': '424'
'425': '425'
'426': '426'
'427': '427'
'428': '428'
'429': '429'
'430': '430'
'431': '431'
'432': '432'
'433': '433'
'434': '434'
'435': '435'
'436': '436'
'437': '437'
'438': '438'
'439': '439'
'440': '440'
'441': '441'
'442': '442'
'443': '443'
'444': '444'
'445': '445'
'446': '446'
'447': '447'
'448': '448'
'449': '449'
'450': '450'
'451': '451'
'452': '452'
'453': '453'
'454': '454'
'455': '455'
'456': '456'
'457': '457'
'458': '458'
'459': '459'
'460': '460'
'461': '461'
'462': '462'
'463': '463'
'464': '464'
'465': '465'
'466': '466'
'467': '467'
'468': '468'
'469': '469'
'470': '470'
'471': '471'
'472': '472'
'473': '473'
'474': '474'
'475': '475'
'476': '476'
'477': '477'
'478': '478'
'479': '479'
'480': '480'
'481': '481'
'482': '482'
'483': '483'
'484': '484'
'485': '485'
'486': '486'
'487': '487'
'488': '488'
'489': '489'
'490': '490'
'491': '491'
'492': '492'
'493': '493'
'494': '494'
'495': '495'
'496': '496'
'497': '497'
'498': '498'
'499': '499'
'500': '500'
'501': '501'
'502': '502'
'503': '503'
'504': '504'
'505': '505'
'506': '506'
'507': '507'
'508': '508'
'509': '509'
'510': '510'
'511': '511'
'512': '512'
'513': '513'
'514': '514'
'515': '515'
'516': '516'
'517': '517'
'518': '518'
'519': '519'
'520': '520'
'521': '521'
'522': '522'
'523': '523'
'524': '524'
'525': '525'
'526': '526'
'527': '527'
'528': '528'
'529': '529'
'530': '530'
'531': '531'
'532': '532'
'533': '533'
'534': '534'
'535': '535'
'536': '536'
'537': '537'
'538': '538'
'539': '539'
'540': '540'
'541': '541'
'542': '542'
'543': '543'
'544': '544'
'545': '545'
'546': '546'
'547': '547'
'548': '548'
'549': '549'
'550': '550'
'551': '551'
'552': '552'
'553': '553'
'554': '554'
'555': '555'
'556': '556'
'557': '557'
'558': '558'
'559': '559'
'560': '560'
'561': '561'
'562': '562'
'563': '563'
'564': '564'
'565': '565'
'566': '566'
'567': '567'
'568': '568'
'569': '569'
'570': '570'
'571': '571'
'572': '572'
'573': '573'
'574': '574'
'575': '575'
'576': '576'
'577': '577'
'578': '578'
'579': '579'
'580': '580'
'581': '581'
'582': '582'
'583': '583'
'584': '584'
'585': '585'
'586': '586'
'587': '587'
'588': '588'
'589': '589'
'590': '590'
'591': '591'
'592': '592'
'593': '593'
'594': '594'
'595': '595'
'596': '596'
'597': '597'
'598': '598'
'599': '599'
'600': '600'
'601': '601'
'602': '602'
'603': '603'
'604': '604'
'605': '605'
'606': '606'
'607': '607'
'608': '608'
'609': '609'
'610': '610'
'611': '611'
'612': '612'
'613': '613'
'614': '614'
'615': '615'
'616': '616'
'617': '617'
'618': '618'
'619': '619'
'620': '620'
'621': '621'
'622': '622'
'623': '623'
'624': '624'
'625': '625'
'626': '626'
'627': '627'
'628': '628'
'629': '629'
'630': '630'
'631': '631'
'632': '632'
'633': '633'
'634': '634'
'635': '635'
'636': '636'
'637': '637'
'638': '638'
'639': '639'
'640': '640'
'641': '641'
'642': '642'
'643': '643'
'644': '644'
'645': '645'
'646': '646'
'647': '647'
'648': '648'
'649': '649'
'650': '650'
'651': '651'
'652': '652'
'653': '653'
'654': '654'
'655': '655'
'656': '656'
'657': '657'
'658': '658'
'659': '659'
'660': '660'
'661': '661'
'662': '662'
'663': '663'
'664': '664'
'665': '665'
'666': '666'
'667': '667'
'668': '668'
'669': '669'
'670': '670'
'671': '671'
'672': '672'
'673': '673'
'674': '674'
'675': '675'
'676': '676'
'677': '677'
'678': '678'
'679': '679'
'680': '680'
'681': '681'
'682': '682'
'683': '683'
'684': '684'
'685': '685'
'686': '686'
'687': '687'
'688': '688'
'689': '689'
'690': '690'
'691': '691'
'692': '692'
'693': '693'
'694': '694'
'695': '695'
'696': '696'
'697': '697'
'698': '698'
'699': '699'
'700': '700'
'701': '701'
'702': '702'
'703': '703'
'704': '704'
'705': '705'
'706': '706'
'707': '707'
'708': '708'
'709': '709'
'710': '710'
'711': '711'
'712': '712'
'713': '713'
'714': '714'
'715': '715'
'716': '716'
'717': '717'
'718': '718'
'719': '719'
'720': '720'
'721': '721'
'722': '722'
'723': '723'
'724': '724'
'725': '725'
'726': '726'
'727': '727'
'728': '728'
'729': '729'
'730': '730'
'731': '731'
'732': '732'
'733': '733'
'734': '734'
'735': '735'
'736': '736'
'737': '737'
'738': '738'
'739': '739'
'740': '740'
'741': '741'
'742': '742'
'743': '743'
'744': '744'
'745': '745'
'746': '746'
'747': '747'
'748': '748'
'749': '749'
'750': '750'
'751': '751'
'752': '752'
'753': '753'
'754': '754'
'755': '755'
'756': '756'
'757': '757'
'758': '758'
'759': '759'
'760': '760'
'761': '761'
'762': '762'
'763': '763'
'764': '764'
'765': '765'
'766': '766'
'767': '767'
'768': '768'
'769': '769'
'770': '770'
'771': '771'
'772': '772'
'773': '773'
'774': '774'
'775': '775'
'776': '776'
'777': '777'
'778': '778'
'779': '779'
'780': '780'
'781': '781'
'782': '782'
'783': '783'
'784': '784'
'785': '785'
'786': '786'
'787': '787'
'788': '788'
'789': '789'
'790': '790'
'791': '791'
'792': '792'
'793': '793'
'794': '794'
'795': '795'
'796': '796'
'797': '797'
'798': '798'
'799': '799'
'800': '800'
'801': '801'
'802': '802'
'803': '803'
'804': '804'
'805': '805'
'806': '806'
'807': '807'
'808': '808'
'809': '809'
'810': '810'
'811': '811'
'812': '812'
'813': '813'
'814': '814'
'815': '815'
'816': '816'
'817': '817'
'818': '818'
'819': '819'
'820': '820'
'821': '821'
'822': '822'
'823': '823'
'824': '824'
'825': '825'
'826': '826'
'827': '827'
'828': '828'
'829': '829'
'830': '830'
'831': '831'
'832': '832'
'833': '833'
'834': '834'
'835': '835'
'836': '836'
'837': '837'
'838': '838'
'839': '839'
'840': '840'
'841': '841'
'842': '842'
'843': '843'
'844': '844'
'845': '845'
'846': '846'
'847': '847'
'848': '848'
'849': '849'
'850': '850'
'851': '851'
'852': '852'
'853': '853'
'854': '854'
'855': '855'
'856': '856'
'857': '857'
'858': '858'
'859': '859'
'860': '860'
'861': '861'
'862': '862'
'863': '863'
'864': '864'
'865': '865'
'866': '866'
'867': '867'
'868': '868'
'869': '869'
'870': '870'
'871': '871'
'872': '872'
'873': '873'
'874': '874'
'875': '875'
'876': '876'
'877': '877'
'878': '878'
'879': '879'
'880': '880'
'881': '881'
'882': '882'
'883': '883'
'884': '884'
'885': '885'
'886': '886'
'887': '887'
'888': '888'
'889': '889'
'890': '890'
'891': '891'
'892': '892'
'893': '893'
'894': '894'
'895': '895'
'896': '896'
'897': '897'
'898': '898'
'899': '899'
'900': '900'
'901': '901'
'902': '902'
'903': '903'
'904': '904'
'905': '905'
'906': '906'
'907': '907'
'908': '908'
'909': '909'
'910': '910'
'911': '911'
'912': '912'
'913': '913'
'914': '914'
'915': '915'
'916': '916'
'917': '917'
'918': '918'
'919': '919'
'920': '920'
'921': '921'
'922': '922'
'923': '923'
'924': '924'
'925': '925'
'926': '926'
'927': '927'
'928': '928'
'929': '929'
'930': '930'
'931': '931'
'932': '932'
'933': '933'
'934': '934'
'935': '935'
'936': '936'
'937': '937'
'938': '938'
'939': '939'
'940': '940'
'941': '941'
'942': '942'
'943': '943'
'944': '944'
'945': '945'
'946': '946'
'947': '947'
'948': '948'
'949': '949'
'950': '950'
'951': '951'
'952': '952'
'953': '953'
'954': '954'
'955': '955'
'956': '956'
'957': '957'
'958': '958'
'959': '959'
'960': '960'
'961': '961'
'962': '962'
'963': '963'
'964': '964'
'965': '965'
'966': '966'
'967': '967'
'968': '968'
'969': '969'
'970': '970'
'971': '971'
'972': '972'
'973': '973'
'974': '974'
'975': '975'
'976': '976'
'977': '977'
'978': '978'
'979': '979'
'980': '980'
'981': '981'
'982': '982'
'983': '983'
'984': '984'
'985': '985'
'986': '986'
'987': '987'
'988': '988'
'989': '989'
'990': '990'
'991': '991'
'992': '992'
'993': '993'
'994': '994'
'995': '995'
'996': '996'
'997': '997'
'998': '998'
'999': '999'
'1000': '1000'
'1001': '1001'
'1002': '1002'
'1003': '1003'
'1004': '1004'
'1005': '1005'
'1006': '1006'
'1007': '1007'
'1008': '1008'
'1009': '1009'
'1010': '1010'
'1011': '1011'
'1012': '1012'
'1013': '1013'
'1014': '1014'
'1015': '1015'
'1016': '1016'
'1017': '1017'
'1018': '1018'
'1019': '1019'
'1020': '1020'
'1021': '1021'
'1022': '1022'
'1023': '1023'
'1024': '1024'
'1025': '1025'
'1026': '1026'
'1027': '1027'
'1028': '1028'
'1029': '1029'
'1030': '1030'
'1031': '1031'
'1032': '1032'
'1033': '1033'
'1034': '1034'
'1035': '1035'
'1036': '1036'
'1037': '1037'
'1038': '1038'
'1039': '1039'
'1040': '1040'
'1041': '1041'
'1042': '1042'
'1043': '1043'
'1044': '1044'
'1045': '1045'
'1046': '1046'
'1047': '1047'
'1048': '1048'
'1049': '1049'
'1050': '1050'
'1051': '1051'
'1052': '1052'
'1053': '1053'
'1054': '1054'
splits:
- name: train
num_bytes: 975588640.45
num_examples: 20210
- name: val
num_bytes: 81667102.0
num_examples: 2000
download_size: 873806425
dataset_size: 1057255742.45
---
# Dataset Card for "ade20k_image_classification"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Anonymous2023/dataset2python | ---
license: mit
---
|
Nexdata/Multi-pose_Faces_Data | ---
license: cc-by-nc-4.0
---
# Description
Multi-pose face data from 3,919 people, with 24 images and 9 videos per person. The collection environment includes indoor and outdoor scenes. This data can be used for face detection, face recognition, and other tasks.
For more details, please visit: https://www.nexdata.ai/datasets/1199?source=Huggingface
# Specifications
## Data size
3,919 people, 24 images and 9 videos per person
## Race distribution
Asians
## Nationality distribution
114 people from Cambodia, 1,951 people from Indonesia, 34 people from Korea, 234 people from Mongolia, 1,107 people from Philippines, 479 people from Vietnam
## Gender distribution
2,046 males, 1,873 females
## Age distribution
1,338 people under 18 years old, 1,975 people aged from 18 to 45, 404 people aged from 46 to 60, 202 people over 60 years old
## Collecting environment
indoor and outdoor scenes
## Data diversity
different face poses, nationalities, ages, lighting conditions, and scenes
## Device
cellphone
## Data format
the image data format is .jpeg, .jpg; the video data format is .mp4, .mov
## Accuracy
the accuracy of the labels for face pose, head pose, nationality, gender, collection environment, and age is more than 97%
# Licensing Information
Commercial License |
xwjiang2010/pile_dedupe_val_tokenized | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 6941817272
num_examples: 1000000
download_size: 3202764782
dataset_size: 6941817272
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
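Since each `input_ids` entry is a sequence of `int32` tokens, the declared sizes give a quick, download-free estimate of the average sequence length. This ignores Arrow container overhead, so treat it as a rough upper bound, not an exact figure:

```python
# Declared sizes from the YAML header above.
num_bytes = 6941817272
num_examples = 1_000_000

# int32 tokens are 4 bytes each; container overhead is ignored,
# so this is only an approximation.
approx_tokens_per_example = num_bytes / (4 * num_examples)
print(round(approx_tokens_per_example))  # roughly 1735 tokens per example
```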
|
amitness/logits-mt-it-ar-128 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
- name: teacher_logits
sequence:
sequence: float64
- name: teacher_indices
sequence:
sequence: int64
- name: teacher_mask_indices
sequence: int64
splits:
- name: train
num_bytes: 53010711524
num_examples: 11706283
- name: test
num_bytes: 9355220532
num_examples: 2065815
download_size: 0
dataset_size: 62365932056
---
# Dataset Card for "logits-mt-it-ar-128"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_quantumaikr__llama-2-70b-fb16-guanaco-1k | ---
pretty_name: Evaluation run of quantumaikr/llama-2-70b-fb16-guanaco-1k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [quantumaikr/llama-2-70b-fb16-guanaco-1k](https://huggingface.co/quantumaikr/llama-2-70b-fb16-guanaco-1k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 61 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_quantumaikr__llama-2-70b-fb16-guanaco-1k\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-10T00:33:03.607588](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__llama-2-70b-fb16-guanaco-1k/blob/main/results_2023-08-10T00%3A33%3A03.607588.json)\
  \ (note that there might be results for other tasks in the repos if successive evals\
  \ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7013441332798022,\n\
\ \"acc_stderr\": 0.03091715385865452,\n \"acc_norm\": 0.7054300239648517,\n\
\ \"acc_norm_stderr\": 0.030884754243271178,\n \"mc1\": 0.40636474908200737,\n\
\ \"mc1_stderr\": 0.0171938358120939,\n \"mc2\": 0.5756052671501329,\n\
\ \"mc2_stderr\": 0.014559658555893657\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6510238907849829,\n \"acc_stderr\": 0.013928933461382501,\n\
\ \"acc_norm\": 0.7047781569965871,\n \"acc_norm_stderr\": 0.013329750293382318\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.686018721370245,\n\
\ \"acc_stderr\": 0.004631603539751948,\n \"acc_norm\": 0.8733320055765784,\n\
\ \"acc_norm_stderr\": 0.00331920940013512\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.041539484047424,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.041539484047424\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03317672787533157,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03317672787533157\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n\
\ \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7320754716981132,\n \"acc_stderr\": 0.027257260322494845,\n\
\ \"acc_norm\": 0.7320754716981132,\n \"acc_norm_stderr\": 0.027257260322494845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8402777777777778,\n\
\ \"acc_stderr\": 0.030635578972093274,\n \"acc_norm\": 0.8402777777777778,\n\
\ \"acc_norm_stderr\": 0.030635578972093274\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736413,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736413\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6638297872340425,\n \"acc_stderr\": 0.030881618520676942,\n\
\ \"acc_norm\": 0.6638297872340425,\n \"acc_norm_stderr\": 0.030881618520676942\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.04028731532947559,\n\
\ \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.04028731532947559\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4497354497354497,\n \"acc_stderr\": 0.02562085704293665,\n \"\
acc_norm\": 0.4497354497354497,\n \"acc_norm_stderr\": 0.02562085704293665\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8354838709677419,\n\
\ \"acc_stderr\": 0.021090847745939306,\n \"acc_norm\": 0.8354838709677419,\n\
\ \"acc_norm_stderr\": 0.021090847745939306\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592154,\n\
\ \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\"\
: 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8242424242424242,\n \"acc_stderr\": 0.02972094300622445,\n\
\ \"acc_norm\": 0.8242424242424242,\n \"acc_norm_stderr\": 0.02972094300622445\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.898989898989899,\n \"acc_stderr\": 0.02146973557605533,\n \"acc_norm\"\
: 0.898989898989899,\n \"acc_norm_stderr\": 0.02146973557605533\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240528,\n\
\ \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240528\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.022421273612923714,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.022421273612923714\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7605042016806722,\n \"acc_stderr\": 0.02772206549336127,\n \
\ \"acc_norm\": 0.7605042016806722,\n \"acc_norm_stderr\": 0.02772206549336127\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.45695364238410596,\n \"acc_stderr\": 0.04067325174247443,\n \"\
acc_norm\": 0.45695364238410596,\n \"acc_norm_stderr\": 0.04067325174247443\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8935779816513761,\n \"acc_stderr\": 0.013221554674594372,\n \"\
acc_norm\": 0.8935779816513761,\n \"acc_norm_stderr\": 0.013221554674594372\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.625,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.625,\n\
\ \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073312,\n\
\ \"acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8860759493670886,\n \"acc_stderr\": 0.020681745135884562,\n \
\ \"acc_norm\": 0.8860759493670886,\n \"acc_norm_stderr\": 0.020681745135884562\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7847533632286996,\n\
\ \"acc_stderr\": 0.027584066602208274,\n \"acc_norm\": 0.7847533632286996,\n\
\ \"acc_norm_stderr\": 0.027584066602208274\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8549618320610687,\n \"acc_stderr\": 0.030884661089515375,\n\
\ \"acc_norm\": 0.8549618320610687,\n \"acc_norm_stderr\": 0.030884661089515375\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8677685950413223,\n \"acc_stderr\": 0.03092278832044579,\n \"\
acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.03092278832044579\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n\
\ \"acc_stderr\": 0.035207039905179635,\n \"acc_norm\": 0.8425925925925926,\n\
\ \"acc_norm_stderr\": 0.035207039905179635\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n\
\ \"acc_stderr\": 0.01987565502786746,\n \"acc_norm\": 0.8974358974358975,\n\
\ \"acc_norm_stderr\": 0.01987565502786746\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8646232439335888,\n\
\ \"acc_stderr\": 0.012234384586856488,\n \"acc_norm\": 0.8646232439335888,\n\
\ \"acc_norm_stderr\": 0.012234384586856488\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7832369942196532,\n \"acc_stderr\": 0.022183477668412856,\n\
\ \"acc_norm\": 0.7832369942196532,\n \"acc_norm_stderr\": 0.022183477668412856\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5910614525139665,\n\
\ \"acc_stderr\": 0.016442830654715544,\n \"acc_norm\": 0.5910614525139665,\n\
\ \"acc_norm_stderr\": 0.016442830654715544\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.024170840879340873,\n\
\ \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.024170840879340873\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.77491961414791,\n\
\ \"acc_stderr\": 0.023720088516179027,\n \"acc_norm\": 0.77491961414791,\n\
\ \"acc_norm_stderr\": 0.023720088516179027\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8364197530864198,\n \"acc_stderr\": 0.02058146613825712,\n\
\ \"acc_norm\": 0.8364197530864198,\n \"acc_norm_stderr\": 0.02058146613825712\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5709219858156028,\n \"acc_stderr\": 0.029525914302558562,\n \
\ \"acc_norm\": 0.5709219858156028,\n \"acc_norm_stderr\": 0.029525914302558562\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5560625814863103,\n\
\ \"acc_stderr\": 0.012689708167787679,\n \"acc_norm\": 0.5560625814863103,\n\
\ \"acc_norm_stderr\": 0.012689708167787679\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7536764705882353,\n \"acc_stderr\": 0.02617343857052,\n\
\ \"acc_norm\": 0.7536764705882353,\n \"acc_norm_stderr\": 0.02617343857052\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7565359477124183,\n \"acc_stderr\": 0.017362473762146613,\n \
\ \"acc_norm\": 0.7565359477124183,\n \"acc_norm_stderr\": 0.017362473762146613\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n\
\ \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.7454545454545455,\n\
\ \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8204081632653061,\n \"acc_stderr\": 0.024573293589585637,\n\
\ \"acc_norm\": 0.8204081632653061,\n \"acc_norm_stderr\": 0.024573293589585637\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n\
\ \"acc_stderr\": 0.022076326101824657,\n \"acc_norm\": 0.8905472636815921,\n\
\ \"acc_norm_stderr\": 0.022076326101824657\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070806,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070806\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40636474908200737,\n\
\ \"mc1_stderr\": 0.0171938358120939,\n \"mc2\": 0.5756052671501329,\n\
\ \"mc2_stderr\": 0.014559658555893657\n }\n}\n```"
repo_url: https://huggingface.co/quantumaikr/llama-2-70b-fb16-guanaco-1k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|arc:challenge|25_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hellaswag|10_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-10T00:33:03.607588.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-10T00:33:03.607588.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-10T00:33:03.607588.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-10T00:33:03.607588.parquet'
- config_name: results
data_files:
- split: 2023_08_10T00_33_03.607588
path:
- results_2023-08-10T00:33:03.607588.parquet
- split: latest
path:
- results_2023-08-10T00:33:03.607588.parquet
---
# Dataset Card for Evaluation run of quantumaikr/llama-2-70b-fb16-guanaco-1k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/quantumaikr/llama-2-70b-fb16-guanaco-1k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [quantumaikr/llama-2-70b-fb16-guanaco-1k](https://huggingface.co/quantumaikr/llama-2-70b-fb16-guanaco-1k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_quantumaikr__llama-2-70b-fb16-guanaco-1k",
"harness_truthfulqa_mc_0",
split="train")
```
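The timestamped split names in the configurations above appear to be derived from the run timestamp by replacing characters that are not valid in split names. A small sketch of that inferred mapping (not an official API), useful when targeting a specific run rather than `latest`:

```python
# Sketch: split names appear to be the run timestamp with '-' and ':'
# replaced by '_' (inferred from the split names above, not an official API).
def timestamp_to_split(ts: str) -> str:
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-08-10T00:33:03.607588"))
# → 2023_08_10T00_33_03.607588
```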
## Latest results
These are the [latest results from run 2023-08-10T00:33:03.607588](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__llama-2-70b-fb16-guanaco-1k/blob/main/results_2023-08-10T00%3A33%3A03.607588.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7013441332798022,
"acc_stderr": 0.03091715385865452,
"acc_norm": 0.7054300239648517,
"acc_norm_stderr": 0.030884754243271178,
"mc1": 0.40636474908200737,
"mc1_stderr": 0.0171938358120939,
"mc2": 0.5756052671501329,
"mc2_stderr": 0.014559658555893657
},
"harness|arc:challenge|25": {
"acc": 0.6510238907849829,
"acc_stderr": 0.013928933461382501,
"acc_norm": 0.7047781569965871,
"acc_norm_stderr": 0.013329750293382318
},
"harness|hellaswag|10": {
"acc": 0.686018721370245,
"acc_stderr": 0.004631603539751948,
"acc_norm": 0.8733320055765784,
"acc_norm_stderr": 0.00331920940013512
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.041539484047424,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.041539484047424
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7320754716981132,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.7320754716981132,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8402777777777778,
"acc_stderr": 0.030635578972093274,
"acc_norm": 0.8402777777777778,
"acc_norm_stderr": 0.030635578972093274
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736413,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736413
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6638297872340425,
"acc_stderr": 0.030881618520676942,
"acc_norm": 0.6638297872340425,
"acc_norm_stderr": 0.030881618520676942
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6275862068965518,
"acc_stderr": 0.04028731532947559,
"acc_norm": 0.6275862068965518,
"acc_norm_stderr": 0.04028731532947559
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4497354497354497,
"acc_stderr": 0.02562085704293665,
"acc_norm": 0.4497354497354497,
"acc_norm_stderr": 0.02562085704293665
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8354838709677419,
"acc_stderr": 0.021090847745939306,
"acc_norm": 0.8354838709677419,
"acc_norm_stderr": 0.021090847745939306
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8242424242424242,
"acc_stderr": 0.02972094300622445,
"acc_norm": 0.8242424242424242,
"acc_norm_stderr": 0.02972094300622445
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.898989898989899,
"acc_stderr": 0.02146973557605533,
"acc_norm": 0.898989898989899,
"acc_norm_stderr": 0.02146973557605533
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.017426974154240528,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.017426974154240528
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.022421273612923714,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.022421273612923714
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7605042016806722,
"acc_stderr": 0.02772206549336127,
"acc_norm": 0.7605042016806722,
"acc_norm_stderr": 0.02772206549336127
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.45695364238410596,
"acc_stderr": 0.04067325174247443,
"acc_norm": 0.45695364238410596,
"acc_norm_stderr": 0.04067325174247443
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8935779816513761,
"acc_stderr": 0.013221554674594372,
"acc_norm": 0.8935779816513761,
"acc_norm_stderr": 0.013221554674594372
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.625,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.625,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9313725490196079,
"acc_stderr": 0.017744453647073312,
"acc_norm": 0.9313725490196079,
"acc_norm_stderr": 0.017744453647073312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8860759493670886,
"acc_stderr": 0.020681745135884562,
"acc_norm": 0.8860759493670886,
"acc_norm_stderr": 0.020681745135884562
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7847533632286996,
"acc_stderr": 0.027584066602208274,
"acc_norm": 0.7847533632286996,
"acc_norm_stderr": 0.027584066602208274
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8549618320610687,
"acc_stderr": 0.030884661089515375,
"acc_norm": 0.8549618320610687,
"acc_norm_stderr": 0.030884661089515375
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.03092278832044579,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.03092278832044579
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.035207039905179635,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.035207039905179635
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.03675668832233188,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.03675668832233188
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.01987565502786746,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.01987565502786746
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8646232439335888,
"acc_stderr": 0.012234384586856488,
"acc_norm": 0.8646232439335888,
"acc_norm_stderr": 0.012234384586856488
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7832369942196532,
"acc_stderr": 0.022183477668412856,
"acc_norm": 0.7832369942196532,
"acc_norm_stderr": 0.022183477668412856
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5910614525139665,
"acc_stderr": 0.016442830654715544,
"acc_norm": 0.5910614525139665,
"acc_norm_stderr": 0.016442830654715544
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.024170840879340873,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.024170840879340873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.77491961414791,
"acc_stderr": 0.023720088516179027,
"acc_norm": 0.77491961414791,
"acc_norm_stderr": 0.023720088516179027
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8364197530864198,
"acc_stderr": 0.02058146613825712,
"acc_norm": 0.8364197530864198,
"acc_norm_stderr": 0.02058146613825712
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5709219858156028,
"acc_stderr": 0.029525914302558562,
"acc_norm": 0.5709219858156028,
"acc_norm_stderr": 0.029525914302558562
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5560625814863103,
"acc_stderr": 0.012689708167787679,
"acc_norm": 0.5560625814863103,
"acc_norm_stderr": 0.012689708167787679
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7536764705882353,
"acc_stderr": 0.02617343857052,
"acc_norm": 0.7536764705882353,
"acc_norm_stderr": 0.02617343857052
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7565359477124183,
"acc_stderr": 0.017362473762146613,
"acc_norm": 0.7565359477124183,
"acc_norm_stderr": 0.017362473762146613
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.041723430387053825,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.041723430387053825
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8204081632653061,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.8204081632653061,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.022076326101824657,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.022076326101824657
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070806,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40636474908200737,
"mc1_stderr": 0.0171938358120939,
"mc2": 0.5756052671501329,
"mc2_stderr": 0.014559658555893657
}
}
```
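The `"all"` block at the top aggregates the per-task scores. As a hedged sketch (assuming a simple unweighted mean over tasks, which may not match the leaderboard's exact aggregation), a macro-average can be computed from such a dict like this:

```python
# Sketch: macro-average "acc" over per-task entries of a results dict like
# the one above (a hypothetical three-task subset; the real aggregation on
# the leaderboard may weight or select tasks differently).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.37},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6370370370370371},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7894736842105263},
}

task_accs = [v["acc"] for task, v in results.items() if task != "all"]
macro_avg = sum(task_accs) / len(task_accs)
print(round(macro_avg, 4))  # → 0.5988
```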
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-high_school_geography-rule-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 87904
num_examples: 198
download_size: 53734
dataset_size: 87904
---
# Dataset Card for "mmlu-high_school_geography-rule-neg-prepend"
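The `answer` feature declared above is a `ClassLabel`, so loaded examples store integer labels rather than letters. A minimal sketch of mapping them back (the names list mirrors this card's metadata; the helper itself is hypothetical):

```python
# Sketch: map ClassLabel integer answers back to the letter names
# ('A'..'D') declared in this card's metadata.
ANSWER_NAMES = ["A", "B", "C", "D"]

def answer_letter(label: int) -> str:
    return ANSWER_NAMES[label]

print(answer_letter(2))  # → C
```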
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/mmarco_pt | ---
pretty_name: '`mmarco/pt`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `mmarco/pt`
The `mmarco/pt` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/mmarco#mmarco/pt).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=8,841,823
This dataset is used by: [`mmarco_pt_dev`](https://huggingface.co/datasets/irds/mmarco_pt_dev), [`mmarco_pt_dev_small`](https://huggingface.co/datasets/irds/mmarco_pt_dev_small), [`mmarco_pt_dev_v1.1`](https://huggingface.co/datasets/irds/mmarco_pt_dev_v1.1), [`mmarco_pt_train`](https://huggingface.co/datasets/irds/mmarco_pt_train), [`mmarco_pt_train_v1.1`](https://huggingface.co/datasets/irds/mmarco_pt_train_v1.1)
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/mmarco_pt', 'docs')
for record in docs:
record # {'doc_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Bonifacio2021MMarco,
title={{mMARCO}: A Multilingual Version of {MS MARCO} Passage Ranking Dataset},
author={Luiz Henrique Bonifacio and Israel Campiotti and Roberto Lotufo and Rodrigo Nogueira},
year={2021},
journal={arXiv:2108.13897}
}
```
|
open-llm-leaderboard/details_Hemanth-thunder__Tamil-Mistral-7B-Instruct-v0.1 | ---
pretty_name: Evaluation run of Hemanth-thunder/Tamil-Mistral-7B-Instruct-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Hemanth-thunder/Tamil-Mistral-7B-Instruct-v0.1](https://huggingface.co/Hemanth-thunder/Tamil-Mistral-7B-Instruct-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Hemanth-thunder__Tamil-Mistral-7B-Instruct-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-21T16:07:20.456036](https://huggingface.co/datasets/open-llm-leaderboard/details_Hemanth-thunder__Tamil-Mistral-7B-Instruct-v0.1/blob/main/results_2024-03-21T16-07-20.456036.json)\
\ (note that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2439956253766501,\n\
\ \"acc_stderr\": 0.030208590559409422,\n \"acc_norm\": 0.24516065997264377,\n\
\ \"acc_norm_stderr\": 0.03101226879683695,\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871091,\n \"mc2\": 0.4727344950523299,\n\
\ \"mc2_stderr\": 0.01648671604259545\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2167235494880546,\n \"acc_stderr\": 0.012040156713481192,\n\
\ \"acc_norm\": 0.2738907849829352,\n \"acc_norm_stderr\": 0.013032004972989503\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2610037841067516,\n\
\ \"acc_stderr\": 0.004382844128643413,\n \"acc_norm\": 0.27155945030870343,\n\
\ \"acc_norm_stderr\": 0.004438549152538037\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\
\ \"acc_stderr\": 0.037498507091740206,\n \"acc_norm\": 0.2518518518518518,\n\
\ \"acc_norm_stderr\": 0.037498507091740206\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.22641509433962265,\n \"acc_stderr\": 0.025757559893106755,\n\
\ \"acc_norm\": 0.22641509433962265,\n \"acc_norm_stderr\": 0.025757559893106755\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.15,\n \"acc_stderr\": 0.03588702812826372,\n \"acc_norm\": 0.15,\n\
\ \"acc_norm_stderr\": 0.03588702812826372\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n\
\ \"acc_stderr\": 0.03063114553919882,\n \"acc_norm\": 0.2023121387283237,\n\
\ \"acc_norm_stderr\": 0.03063114553919882\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.03873958714149351,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.03873958714149351\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3191489361702128,\n \"acc_stderr\": 0.030472973363380045,\n\
\ \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.030472973363380045\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n\
\ \"acc_stderr\": 0.037649508797906066,\n \"acc_norm\": 0.23015873015873015,\n\
\ \"acc_norm_stderr\": 0.037649508797906066\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.31290322580645163,\n \"acc_stderr\": 0.02637756702864586,\n \"\
acc_norm\": 0.31290322580645163,\n \"acc_norm_stderr\": 0.02637756702864586\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n \"\
acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.26262626262626265,\n \"acc_stderr\": 0.031353050095330855,\n \"\
acc_norm\": 0.26262626262626265,\n \"acc_norm_stderr\": 0.031353050095330855\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22279792746113988,\n \"acc_stderr\": 0.03003114797764154,\n\
\ \"acc_norm\": 0.22279792746113988,\n \"acc_norm_stderr\": 0.03003114797764154\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.020752423722128013,\n\
\ \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.020752423722128013\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.026466117538959912,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.026466117538959912\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.027553614467863804,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.027553614467863804\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436775,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.20917431192660552,\n \"acc_stderr\": 0.017437937173343233,\n \"\
acc_norm\": 0.20917431192660552,\n \"acc_norm_stderr\": 0.017437937173343233\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.034076320938540516,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.034076320938540516\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658342,\n\
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658342\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.36771300448430494,\n\
\ \"acc_stderr\": 0.03236198350928276,\n \"acc_norm\": 0.36771300448430494,\n\
\ \"acc_norm_stderr\": 0.03236198350928276\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2644628099173554,\n \"acc_stderr\": 0.04026187527591207,\n \"\
acc_norm\": 0.2644628099173554,\n \"acc_norm_stderr\": 0.04026187527591207\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.033220157957767414,\n\
\ \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.033220157957767414\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.04157751539865629,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.04157751539865629\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19230769230769232,\n\
\ \"acc_stderr\": 0.02581923325648369,\n \"acc_norm\": 0.19230769230769232,\n\
\ \"acc_norm_stderr\": 0.02581923325648369\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2886334610472541,\n\
\ \"acc_stderr\": 0.016203792703197797,\n \"acc_norm\": 0.2886334610472541,\n\
\ \"acc_norm_stderr\": 0.016203792703197797\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2254335260115607,\n \"acc_stderr\": 0.022497230190967547,\n\
\ \"acc_norm\": 0.2254335260115607,\n \"acc_norm_stderr\": 0.022497230190967547\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2569832402234637,\n\
\ \"acc_stderr\": 0.014614465821966337,\n \"acc_norm\": 0.2569832402234637,\n\
\ \"acc_norm_stderr\": 0.014614465821966337\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.21895424836601307,\n \"acc_stderr\": 0.02367908986180772,\n\
\ \"acc_norm\": 0.21895424836601307,\n \"acc_norm_stderr\": 0.02367908986180772\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2604501607717042,\n\
\ \"acc_stderr\": 0.024926723224845536,\n \"acc_norm\": 0.2604501607717042,\n\
\ \"acc_norm_stderr\": 0.024926723224845536\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2808641975308642,\n \"acc_stderr\": 0.025006469755799208,\n\
\ \"acc_norm\": 0.2808641975308642,\n \"acc_norm_stderr\": 0.025006469755799208\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.02601199293090201,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.02601199293090201\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2529335071707953,\n\
\ \"acc_stderr\": 0.011102268713839989,\n \"acc_norm\": 0.2529335071707953,\n\
\ \"acc_norm_stderr\": 0.011102268713839989\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.2536764705882353,\n \"acc_stderr\": 0.026431329870789534,\n\
\ \"acc_norm\": 0.2536764705882353,\n \"acc_norm_stderr\": 0.026431329870789534\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.04013964554072774,\n\
\ \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.04013964554072774\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.17959183673469387,\n\
\ \"acc_stderr\": 0.024573293589585637,\n \"acc_norm\": 0.17959183673469387,\n\
\ \"acc_norm_stderr\": 0.024573293589585637\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.263681592039801,\n \"acc_stderr\": 0.03115715086935557,\n\
\ \"acc_norm\": 0.263681592039801,\n \"acc_norm_stderr\": 0.03115715086935557\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.18674698795180722,\n \"acc_stderr\": 0.030338749144500608,\n\
\ \"acc_norm\": 0.18674698795180722,\n \"acc_norm_stderr\": 0.030338749144500608\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.21052631578947367,\n\
\ \"acc_stderr\": 0.0312678171466318,\n \"acc_norm\": 0.21052631578947367,\n\
\ \"acc_norm_stderr\": 0.0312678171466318\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.23623011015911874,\n \"mc1_stderr\": 0.014869755015871091,\n\
\ \"mc2\": 0.4727344950523299,\n \"mc2_stderr\": 0.01648671604259545\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.48697711128650356,\n\
\ \"acc_stderr\": 0.014047718393997667\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Hemanth-thunder/Tamil-Mistral-7B-Instruct-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|arc:challenge|25_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|arc:challenge|25_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|gsm8k|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|gsm8k|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hellaswag|10_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hellaswag|10_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T16-00-28.551336.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T16-07-20.456036.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T16-07-20.456036.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- '**/details_harness|winogrande|5_2024-03-21T16-00-28.551336.parquet'
- split: 2024_03_21T16_07_20.456036
path:
- '**/details_harness|winogrande|5_2024-03-21T16-07-20.456036.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-21T16-07-20.456036.parquet'
- config_name: results
data_files:
- split: 2024_03_21T16_00_28.551336
path:
- results_2024-03-21T16-00-28.551336.parquet
- split: 2024_03_21T16_07_20.456036
path:
- results_2024-03-21T16-07-20.456036.parquet
- split: latest
path:
- results_2024-03-21T16-07-20.456036.parquet
---
# Dataset Card for Evaluation run of Hemanth-thunder/Tamil-Mistral-7B-Instruct-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Hemanth-thunder/Tamil-Mistral-7B-Instruct-v0.1](https://huggingface.co/Hemanth-thunder/Tamil-Mistral-7B-Instruct-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Hemanth-thunder__Tamil-Mistral-7B-Instruct-v0.1",
"harness_winogrande_5",
	split="latest")
```
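Because the timestamped split names use a zero-padded, year-first format, lexicographic order matches chronological order, so the run that "latest" points to can also be identified by simply taking the maximum split name. A minimal sketch, using the two split names from this card:

```python
# Timestamped split names as they appear in this dataset's configurations
# (format: YYYY_MM_DDTHH_MM_SS.microseconds).
splits = ["2024_03_21T16_00_28.551336", "2024_03_21T16_07_20.456036"]

# Zero-padded, year-first timestamps sort lexicographically in chronological
# order, so max() yields the run the "latest" split resolves to.
latest = max(splits)
print(latest)  # 2024_03_21T16_07_20.456036
```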
## Latest results
These are the [latest results from run 2024-03-21T16:07:20.456036](https://huggingface.co/datasets/open-llm-leaderboard/details_Hemanth-thunder__Tamil-Mistral-7B-Instruct-v0.1/blob/main/results_2024-03-21T16-07-20.456036.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.2439956253766501,
"acc_stderr": 0.030208590559409422,
"acc_norm": 0.24516065997264377,
"acc_norm_stderr": 0.03101226879683695,
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871091,
"mc2": 0.4727344950523299,
"mc2_stderr": 0.01648671604259545
},
"harness|arc:challenge|25": {
"acc": 0.2167235494880546,
"acc_stderr": 0.012040156713481192,
"acc_norm": 0.2738907849829352,
"acc_norm_stderr": 0.013032004972989503
},
"harness|hellaswag|10": {
"acc": 0.2610037841067516,
"acc_stderr": 0.004382844128643413,
"acc_norm": 0.27155945030870343,
"acc_norm_stderr": 0.004438549152538037
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.17,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.17,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.037498507091740206,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.037498507091740206
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22641509433962265,
"acc_stderr": 0.025757559893106755,
"acc_norm": 0.22641509433962265,
"acc_norm_stderr": 0.025757559893106755
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.15,
"acc_stderr": 0.03588702812826372,
"acc_norm": 0.15,
"acc_norm_stderr": 0.03588702812826372
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.03063114553919882,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.03063114553919882
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.03873958714149351,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.03873958714149351
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3191489361702128,
"acc_stderr": 0.030472973363380045,
"acc_norm": 0.3191489361702128,
"acc_norm_stderr": 0.030472973363380045
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.037649508797906066,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.037649508797906066
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.31290322580645163,
"acc_stderr": 0.02637756702864586,
"acc_norm": 0.31290322580645163,
"acc_norm_stderr": 0.02637756702864586
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.26262626262626265,
"acc_stderr": 0.031353050095330855,
"acc_norm": 0.26262626262626265,
"acc_norm_stderr": 0.031353050095330855
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22279792746113988,
"acc_stderr": 0.03003114797764154,
"acc_norm": 0.22279792746113988,
"acc_norm_stderr": 0.03003114797764154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.020752423722128013,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.020752423722128013
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.026466117538959912,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.026466117538959912
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.027553614467863804,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.027553614467863804
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436775,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.20917431192660552,
"acc_stderr": 0.017437937173343233,
"acc_norm": 0.20917431192660552,
"acc_norm_stderr": 0.017437937173343233
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.028756799629658342,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.028756799629658342
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.36771300448430494,
"acc_stderr": 0.03236198350928276,
"acc_norm": 0.36771300448430494,
"acc_norm_stderr": 0.03236198350928276
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2644628099173554,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.2644628099173554,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.033220157957767414,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.033220157957767414
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.04157751539865629,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.04157751539865629
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19230769230769232,
"acc_stderr": 0.02581923325648369,
"acc_norm": 0.19230769230769232,
"acc_norm_stderr": 0.02581923325648369
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2886334610472541,
"acc_stderr": 0.016203792703197797,
"acc_norm": 0.2886334610472541,
"acc_norm_stderr": 0.016203792703197797
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.022497230190967547,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.022497230190967547
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2569832402234637,
"acc_stderr": 0.014614465821966337,
"acc_norm": 0.2569832402234637,
"acc_norm_stderr": 0.014614465821966337
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.21895424836601307,
"acc_stderr": 0.02367908986180772,
"acc_norm": 0.21895424836601307,
"acc_norm_stderr": 0.02367908986180772
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2604501607717042,
"acc_stderr": 0.024926723224845536,
"acc_norm": 0.2604501607717042,
"acc_norm_stderr": 0.024926723224845536
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2808641975308642,
"acc_stderr": 0.025006469755799208,
"acc_norm": 0.2808641975308642,
"acc_norm_stderr": 0.025006469755799208
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.02601199293090201,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.02601199293090201
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2529335071707953,
"acc_stderr": 0.011102268713839989,
"acc_norm": 0.2529335071707953,
"acc_norm_stderr": 0.011102268713839989
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2536764705882353,
"acc_stderr": 0.026431329870789534,
"acc_norm": 0.2536764705882353,
"acc_norm_stderr": 0.026431329870789534
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072774,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072774
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17959183673469387,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.17959183673469387,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.263681592039801,
"acc_stderr": 0.03115715086935557,
"acc_norm": 0.263681592039801,
"acc_norm_stderr": 0.03115715086935557
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-virology|5": {
"acc": 0.18674698795180722,
"acc_stderr": 0.030338749144500608,
"acc_norm": 0.18674698795180722,
"acc_norm_stderr": 0.030338749144500608
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871091,
"mc2": 0.4727344950523299,
"mc2_stderr": 0.01648671604259545
},
"harness|winogrande|5": {
"acc": 0.48697711128650356,
"acc_stderr": 0.014047718393997667
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
damlab/uniprot | ---
license: mit
---
# Dataset Description
## Dataset Summary
This dataset is a mirror of the Uniprot/SwissProt database. It contains the names and sequences of >500K proteins.
This dataset was parsed from the FASTA file at https://ftp.uniprot.org/pub/databases/uniprot/current_release/knowledgebase/complete/uniprot_sprot.fasta.gz.
Supported Tasks and Leaderboards: None
Languages: English
## Dataset Structure
### Data Instances
Data Fields: id, description, sequence
Data Splits: None
## Dataset Creation
The dataset was downloaded and parsed into a `dataset` object and uploaded unchanged.
Initial Data Collection and Normalization: Dataset was downloaded and curated on 03/09/2022.
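A minimal sketch of how a FASTA dump like the one above can be parsed into the `id`/`description`/`sequence` fields (the exact parsing script used to build this dataset is not published, so the details here are an assumption):

```python
# Hedged sketch: parse FASTA records into the id/description/sequence schema.
from typing import Iterator


def parse_fasta(lines) -> Iterator[dict]:
    """Yield {'id', 'description', 'sequence'} records from FASTA lines."""
    header, seq = None, []
    for line in lines:
        line = line.strip()
        if line.startswith(">"):
            if header is not None:
                yield _record(header, seq)
            header, seq = line[1:], []
        elif line:
            seq.append(line)
    if header is not None:
        yield _record(header, seq)


def _record(header: str, seq: list) -> dict:
    # UniProt headers look like "sp|P12345|NAME_SPECIES Description ...".
    ident, _, description = header.partition(" ")
    return {"id": ident, "description": description, "sequence": "".join(seq)}


fasta = ">sp|P12345|TEST_HUMAN Example protein\nMKT\nAIL\n"
records = list(parse_fasta(fasta.splitlines()))
print(records[0]["id"])        # sp|P12345|TEST_HUMAN
print(records[0]["sequence"])  # MKTAIL
```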
## Considerations for Using the Data
Social Impact of Dataset: Due to the tendency of HIV to mutate, drug resistance is a common issue when attempting to treat those infected with HIV.
Protease inhibitors are a class of drugs to which HIV is known to develop resistance via mutations.
Thus, a collection of protease sequences known to be resistant to one or more drugs constitutes a significant body of data that could be used for computational analysis of protease resistance mutations.
Discussion of Biases: Due to the sampling nature of this database, it is predominantly composed of genes from "well studied" genomes. This may limit the breadth of the genes it contains.
## Additional Information:
- Dataset Curators: Will Dampier
- Citation Information: TBA
|
cjayesh/dreambooth-images | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 136191.0
num_examples: 12
download_size: 127724
dataset_size: 136191.0
---
# Dataset Card for "dreambooth-images"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sun-cake/3dGS-js-source | ---
license: gpl-3.0
---
|
Rimyy/problemMath-Llama1K | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 797941
num_examples: 1000
download_size: 349138
dataset_size: 797941
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
qgallouedec/prj_gia_dataset_metaworld_box_close_v2_1111 | ---
library_name: gia
tags:
- deep-reinforcement-learning
- reinforcement-learning
- gia
- multi-task
- multi-modal
- imitation-learning
- offline-reinforcement-learning
---
An imitation learning dataset for the box-close-v2 environment, containing samples from the box-close-v2 policy.
This dataset was created as part of the Generally Intelligent Agents (GIA) project: https://github.com/huggingface/gia
## Load dataset
First, clone it with
```sh
git clone https://huggingface.co/datasets/qgallouedec/prj_gia_dataset_metaworld_box_close_v2_1111
```
Then, load it with
```python
import numpy as np
dataset = np.load("prj_gia_dataset_metaworld_box_close_v2_1111/dataset.npy", allow_pickle=True).item()
print(dataset.keys()) # dict_keys(['observations', 'actions', 'dones', 'rewards'])
```
|
OpenShape/openshape-objaverse-embeddings | ---
license: mit
---
|
heliosprime/twitter_dataset_1712923800 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 7446
num_examples: 17
download_size: 9107
dataset_size: 7446
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1712923800"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
loubnabnl/stack-filtered-pii-1M-java | ---
dataset_info:
features:
- name: hexsha
dtype: string
- name: size
dtype: int64
- name: ext
dtype: string
- name: lang
dtype: string
- name: max_stars_repo_path
dtype: string
- name: max_stars_repo_name
dtype: string
- name: max_stars_repo_head_hexsha
dtype: string
- name: max_stars_repo_licenses
sequence: string
- name: max_stars_count
dtype: float64
- name: max_stars_repo_stars_event_min_datetime
dtype: string
- name: max_stars_repo_stars_event_max_datetime
dtype: string
- name: max_issues_repo_path
dtype: string
- name: max_issues_repo_name
dtype: string
- name: max_issues_repo_head_hexsha
dtype: string
- name: max_issues_repo_licenses
sequence: string
- name: max_issues_count
dtype: float64
- name: max_issues_repo_issues_event_min_datetime
dtype: string
- name: max_issues_repo_issues_event_max_datetime
dtype: string
- name: max_forks_repo_path
dtype: string
- name: max_forks_repo_name
dtype: string
- name: max_forks_repo_head_hexsha
dtype: string
- name: max_forks_repo_licenses
sequence: string
- name: max_forks_count
dtype: float64
- name: max_forks_repo_forks_event_min_datetime
dtype: string
- name: max_forks_repo_forks_event_max_datetime
dtype: string
- name: avg_line_length
dtype: float64
- name: max_line_length
dtype: int64
- name: alphanum_fraction
dtype: float64
- name: index
dtype: int64
- name: content
dtype: string
splits:
- name: train
num_bytes: 5117781075
num_examples: 1000000
download_size: 1880524833
dataset_size: 5117781075
---
# Dataset Card for "stack-filtered-pii-1M-java"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
michaelnath/annotated_github_dataset_2 | ---
dataset_info:
features:
- name: function
dtype: string
- name: repo_name
dtype: string
- name: path
dtype: string
- name: features
sequence: float32
- name: purpose
dtype: string
- name: detailed_description
dtype: string
- name: code_trans
dtype: string
splits:
- name: train
num_bytes: 8222665
num_examples: 10003
download_size: 2821232
dataset_size: 8222665
---
# Dataset Card for "annotated_github_dataset_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_mnli_standing_stood | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 3814
num_examples: 17
- name: dev_mismatched
num_bytes: 593
num_examples: 5
- name: test_matched
num_bytes: 976
num_examples: 5
- name: test_mismatched
num_bytes: 3984
num_examples: 13
- name: train
num_bytes: 55386
num_examples: 290
download_size: 46080
dataset_size: 64753
---
# Dataset Card for "MULTI_VALUE_mnli_standing_stood"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ywpl/Model_base_YWPL | ---
license: unknown
---
|
daspartho/stable-diffusion-prompts | ---
language: en
dataset_info:
features:
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 284636288
num_examples: 1819808
download_size: 101931289
dataset_size: 284636288
---
Subset dataset of [diffusiondb](https://huggingface.co/datasets/poloclub/diffusiondb) consisting of just unique prompts.
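The exact deduplication script is not published; a hedged sketch of how a unique-prompt subset like this one can be derived from a larger prompt collection:

```python
# Hedged sketch: keep the first occurrence of each prompt, preserving order.
def unique_prompts(rows):
    """Return the unique prompt strings from rows of {'prompt': ...}."""
    seen = set()
    out = []
    for row in rows:
        prompt = row["prompt"]
        if prompt not in seen:
            seen.add(prompt)
            out.append(prompt)
    return out


rows = [{"prompt": "a cat"}, {"prompt": "a dog"}, {"prompt": "a cat"}]
print(unique_prompts(rows))  # ['a cat', 'a dog']
```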
Created this subset dataset for the [Prompt Extend](https://github.com/daspartho/prompt-extend) project. |
emergentorder/StarTrekMemoryAlpha20230216 | ---
annotations_creators: []
language:
- en
language_creators:
- found
license:
- cc-by-nc-4.0
multilinguality:
- monolingual
pretty_name: Memory Alpha - The Star Trek Wiki -Full Database Dump as of 20230216
size_categories:
- 10K<n<100K
source_datasets: []
tags:
- star trek
- memory alpha
task_categories:
- fill-mask
task_ids:
- masked-language-modeling
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 115575629
num_examples: 54234
download_size: 64791573
dataset_size: 115575629
---
|
DopeorNope/Sampled_SSL | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 176006987
num_examples: 43473
download_size: 88842730
dataset_size: 176006987
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
adalib/megengine-data | ---
dataset_info:
features:
- name: code
dtype: string
- name: apis
sequence: string
- name: extract_api
dtype: string
splits:
- name: train
num_bytes: 2597473
num_examples: 216
- name: test
num_bytes: 408242
num_examples: 57
download_size: 989124
dataset_size: 3005715
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
TheGreatRambler/mm2_level_deaths | ---
language:
- multilingual
license:
- cc-by-nc-sa-4.0
multilinguality:
- multilingual
size_categories:
- 100M<n<1B
source_datasets:
- original
task_categories:
- other
- object-detection
- text-retrieval
- token-classification
- text-generation
task_ids: []
pretty_name: Mario Maker 2 level deaths
tags:
- text-mining
---
# Mario Maker 2 level deaths
Part of the [Mario Maker 2 Dataset Collection](https://tgrcode.com/posts/mario_maker_2_datasets)
## Dataset Description
The Mario Maker 2 level deaths dataset consists of 564 million level deaths from Nintendo's online service totaling around 2.5GB of data. The dataset was created using the self-hosted [Mario Maker 2 api](https://tgrcode.com/posts/mario_maker_2_api) over the course of 1 month in February 2022.
### How to use it
The Mario Maker 2 level deaths dataset is very large, so for most use cases it is recommended to use the streaming API of `datasets`. You can load and iterate through the dataset with the following code:
```python
from datasets import load_dataset
ds = load_dataset("TheGreatRambler/mm2_level_deaths", streaming=True, split="train")
print(next(iter(ds)))
#OUTPUT:
{
'data_id': 3000382,
'x': 696,
'y': 0,
'is_subworld': 0
}
```
Each row is a unique death in the level denoted by the `data_id` that occurs at the provided coordinates. `is_subworld` denotes whether the death happened in the main world or the subworld.
You can also download the full dataset. Note that this will download ~2.5GB:
```python
ds = load_dataset("TheGreatRambler/mm2_level_deaths", split="train")
```
## Data Structure
### Data Instances
```python
{
'data_id': 3000382,
'x': 696,
'y': 0,
'is_subworld': 0
}
```
### Data Fields
|Field|Type|Description|
|---|---|---|
|data_id|int|The data ID of the level this death occurred in|
|x|int|X coordinate of death|
|y|int|Y coordinate of death|
|is_subworld|bool|Whether the death happened in the main world or the subworld|
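The coordinate fields lend themselves to simple aggregations; a minimal sketch (using hypothetical rows in this schema) that tallies deaths per cell for one level:

```python
# Hedged sketch: aggregate death rows into a per-level "death heatmap".
from collections import Counter


def death_heatmap(rows, data_id):
    """Count deaths per (x, y, is_subworld) cell for one level."""
    return Counter(
        (r["x"], r["y"], r["is_subworld"])
        for r in rows
        if r["data_id"] == data_id
    )


rows = [  # illustrative rows, not real records
    {"data_id": 3000382, "x": 696, "y": 0, "is_subworld": 0},
    {"data_id": 3000382, "x": 696, "y": 0, "is_subworld": 0},
    {"data_id": 3000383, "x": 10, "y": 5, "is_subworld": 1},
]
print(death_heatmap(rows, 3000382))  # Counter({(696, 0, 0): 2})
```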
### Data Splits
The dataset only contains a train split.
<!-- TODO create detailed statistics -->
## Dataset Creation
The dataset was created over a little more than a month in February 2022 using the self-hosted [Mario Maker 2 api](https://tgrcode.com/posts/mario_maker_2_api). As requests made to Nintendo's servers require authentication, the process had to be done with utmost care, limiting download speed so as not to overload the API and risk a ban. There are no intentions to create an updated release of this dataset.
## Considerations for Using the Data
The dataset contains no harmful language or depictions.
|
CyberHarem/zhongli_genshin | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of zhongli_genshin
This is the dataset of zhongli_genshin, containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 393 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 393 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 393 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 393 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
FaalSa/f13 | ---
dataset_info:
features:
- name: start
dtype: timestamp[s]
- name: target
sequence: float32
- name: item_id
dtype: string
- name: feat_static_cat
sequence: uint64
splits:
- name: train
num_bytes: 79711
num_examples: 1
- name: validation
num_bytes: 80191
num_examples: 1
- name: test
num_bytes: 80671
num_examples: 1
download_size: 45870
dataset_size: 240573
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
irds/lotte_technology_test_search | ---
pretty_name: '`lotte/technology/test/search`'
viewer: false
source_datasets: ['irds/lotte_technology_test']
task_categories:
- text-retrieval
---
# Dataset Card for `lotte/technology/test/search`
The `lotte/technology/test/search` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/lotte#lotte/technology/test/search).
# Data
This dataset provides:
- `queries` (i.e., topics); count=596
- `qrels`: (relevance assessments); count=2,045
- For `docs`, use [`irds/lotte_technology_test`](https://huggingface.co/datasets/irds/lotte_technology_test)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/lotte_technology_test_search', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/lotte_technology_test_search', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
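For downstream evaluation it is often convenient to index the qrels records into a nested dict for fast relevance lookups; a minimal sketch (the records below are toy examples in the schema shown above):

```python
# Hedged sketch: build {query_id: {doc_id: relevance}} from qrels records.
def build_qrels_index(qrels):
    index = {}
    for rec in qrels:
        index.setdefault(rec["query_id"], {})[rec["doc_id"]] = rec["relevance"]
    return index


qrels = [  # toy records, not real assessments
    {"query_id": "q1", "doc_id": "d1", "relevance": 1, "iteration": "0"},
    {"query_id": "q1", "doc_id": "d2", "relevance": 0, "iteration": "0"},
]
index = build_qrels_index(qrels)
print(index["q1"]["d1"])  # 1
```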
## Citation Information
```
@article{Santhanam2021ColBERTv2,
title = "ColBERTv2: Effective and Efficient Retrieval via Lightweight Late Interaction",
author = "Keshav Santhanam and Omar Khattab and Jon Saad-Falcon and Christopher Potts and Matei Zaharia",
journal= "arXiv preprint arXiv:2112.01488",
year = "2021",
url = "https://arxiv.org/abs/2112.01488"
}
```
|
ejazhabibdar/FloorPlanDesign | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 3527550.0
num_examples: 30
download_size: 3486974
dataset_size: 3527550.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
su-fmi/msi-drone-crop-surveys | ---
license: cc-by-4.0
language:
- en
pretty_name: Aerial surveys of a sunflower crop’s lifecycle from April to September 2023
size_categories:
- 100K<n<1M
---
# Dataset Metadata
## Identification Information
### Citation
- **Title**:Aerial surveys of a sunflower crop’s lifecycle from April to September 2023
- **Originator**: Sofia University - faculty of mathematics and informatics, SAP LABS Bulgaria
- **Publication Date**: 2023.11.08
### Abstract
Efficient food production is shaping up to be one of the new frontiers for emerging technologies and solutions. One prominent domain is the remote sensing ecosystem, and more precisely, technologies such as multispectral and hyperspectral sensing equipment.
These devices are gradually moving from academia to industry, and their decreasing cost allows many new applications to emerge.
Multispectral drones are advanced unmanned aerial vehicles (UAVs) equipped with cameras or sensors, capable of capturing imagery across multiple spectral bands. Unlike traditional RGB counterparts, they capture data not only within, but also beyond the visible spectrum, such as near-infrared (NIR). This data can provide valuable insights for various applications, including agriculture, environmental monitoring, land surveying, and more.
One of the main uses of multispectral drones in agriculture is the calculation of vegetation indices (NDVI, NDRE, etc.) and other indices that inform the farmer about crop development, stress, etc. The latter can also serve as an indirect indicator of soil conditions and water distribution. This approach enables more accurate and detailed assessments compared to traditional visual inspections.
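The NDVI mentioned above is defined as (NIR − Red) / (NIR + Red); a minimal NumPy sketch on a toy reflectance patch (the values are illustrative, not taken from this dataset):

```python
# Hedged sketch: compute NDVI from NIR and Red reflectance arrays.
import numpy as np


def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    nir = nir.astype(float)
    red = red.astype(float)
    # Clip the denominator to avoid division by zero on dark pixels.
    return (nir - red) / np.clip(nir + red, 1e-9, None)


nir = np.array([[0.8, 0.6], [0.5, 0.3]])   # toy NIR reflectance
red = np.array([[0.2, 0.2], [0.25, 0.3]])  # toy Red reflectance
print(np.round(ndvi(nir, red), 2))
```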
Similar multispectral data is provided by earth observation satellites such as Sentinel-2; however, they are limited with respect to revisit time and spatial resolution and, most importantly, by their inability to see through clouds. Therefore, the use of multispectral drones can fill these operational gaps and provide more precise and timely data to the farmers.
However, to work simultaneously with satellite and drone data, analysts must have confidence in the precision and comparability of these two data sources (e.g., for NDVI). For example, the DJI P4 multispectral images have slightly different band sensitivities when compared with Sentinel-2, which may cause deviations in the index values. Another prominent problem is related to the field illumination, which depends on time of day and weather conditions. Even though the DJI P4 drone has a calibration sensor, supposed to compensate for the illuminating spectrum deviations, to the best of our knowledge, no public data set exists that demonstrates the tolerance of deviations between e.g., different drone footages or between DJI P4 and Sentinel-2. Moreover, Sentinel-2 implements atmospheric corrections that may contribute to such deviations as well.
Machine learning models can be utilized to extract valuable insights from multispectral data in precision agriculture applications. By leveraging the rich information captured across multiple spectral bands, machine learning algorithms can analyze and interpret the data to provide actionable recommendations for farmers and agronomists, such as highlighting areas with the most vegetation stress. Successful implementation of machine learning models for precision agriculture, based on multispectral data, requires high quality data sets, which are currently scarce. Therefore, collection of a high-quality, multispectral data set is a prerequisite to future machine learning experiments in the domain of precision farming.
For these reasons, our research team conducted multiple surveys, tracking the entire lifecycle of a sunflower field and gathering spectral data.
### Purpose
This dataset was developed as part of a research project, investigating the capabilities and application of drones and multispectral cameras for the agricultural domain.
The provided data can be used for the following scenarios:
1) Training models relying on multispectral datasources.
2) Improve existing algorithms in the computer vision domain.
## Time Period of Content
- **Single Date/Time**: Start Date 2023-04-25 to End Date 2023-09-04
## Data Quality Information
Composite images have been generated with DJI Terra, with 70% frontal and 60% side overlap.
There are instances where a survey was completed over the span of 2 days due to adverse environmental conditions.
Although there was an effort to execute surveys within a consistent time window (morning and afternoon), this was not achieved for some of the runs.
The raw data is validated to be complete - representing the entirety of the observed field for every survey.
### Horizontal Coordinate System
- **Geographic Coordinate System**: EPSG:4326
- **Angular Unit**: Decimal degrees
- **Datum**: WGS 84
- **Prime Meridian**: Greenwich
- **Domain**: Raster
## Entity and Attribute Information
### Detailed Description
#### Entities
Data is organized into directories. Each directory corresponds to one survey and uses **DD.MM.YYYY** format.
Each survey directory contains 2 subdirectories: **raw** and **results**.
The **results** directory is the output of DJI Terra's processing of the raw data collected by the drone.
- Contents:
  - raw
    - subdirectories for each flight required to complete the survey
    - each subdirectory keeps the raw data for each sensing point on the drone's mission path
    - one point is represented by one JPG image and 5 grayscale TIF images, one per drone sensor
  - results
    - composite images, each derived from a single drone sensor; images follow the **result_<Blue, Green, etc.>** nomenclature
    - a .prj projection file for every composite image
    - a .tfw georeference file for every composite image
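As a minimal sketch of working with this layout (the root path below is an assumption; directory names follow the **DD.MM.YYYY** convention described above), the surveys can be enumerated and sorted chronologically like this:

```python
import os
from datetime import datetime

def list_surveys(root: str):
    """Return chronologically sorted (date, path) pairs for survey directories."""
    surveys = []
    for name in sorted(os.listdir(root)):
        try:
            # survey directories follow the DD.MM.YYYY naming convention
            survey_date = datetime.strptime(name, "%d.%m.%Y").date()
        except ValueError:
            continue  # skip non-survey entries such as field_shape.geojson
        surveys.append((survey_date, os.path.join(root, name)))
    return sorted(surveys)
```

Each returned path then contains the raw and results subdirectories described above.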

<p align="center">Composite image sample</p>

<p align="center">Raw data images</p>
All images are embedded with geo-referencing data, timestamps, image quality information, and camera properties.
The dataset holds additional metadata in two files:
- field_shape.geojson - bounding box for the sunflower field
- crop_details.txt - information about the crop
#### Capture apparatus
Drone surveys were executed with a DJI Phantom 4 Multispectral drone. The drone uses the following sensors to capture data:
Sensors: Six 1/2.9” CMOS
Filters:
- Blue (B): 450 nm ± 16 nm
- Green (G): 560 nm ± 16 nm
- Red (R): 650 nm ± 16 nm
- Red edge (RE): 730 nm ± 16 nm
- Near-infrared (NIR): 840 nm ± 26 nm
Lenses:
- FOV (Field of View): 62.7°
- Focal Length: 5.74 mm
- Aperture: f/2.2
Software used for generating composite images: DJI Terra 3.6.8.
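For illustration, the Red (650 nm) and NIR (840 nm) bands listed above are exactly what is needed to compute NDVI, a common vegetation index. Below is a minimal per-pixel sketch in pure Python; reading the actual result_Red/result_NIR composites would additionally require a raster library, which is not shown here:

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index from NIR (840 nm) and Red (650 nm) reflectance."""
    denom = nir + red
    if denom == 0:
        return 0.0  # guard against division by zero on dark pixels
    return (nir - red) / denom

# healthy vegetation reflects strongly in NIR and absorbs red light,
# so a dense canopy yields NDVI values close to 1
print(ndvi(0.50, 0.08))
```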
## Metadata Reference Information
- **Metadata Contact**:
- **Name**: Pavel Genevski
- **Organization**: SAP LABS Bulgaria
- **Position**: Research expert
- **Email**: pavel.genevski@sap.com
- **Metadata Contact**:
- **Name**: Radoslav Stefanov
- **Organization**: SAP LABS Bulgaria
- **Position**: Senior developer
- **Email**: radoslav.stefanov@sap.com
- **Metadata Date**: 2023-11-08
- **Metadata Standard Name**: FGDC Content Standard for Digital Geospatial Metadata
## Additional Information
- **Keywords**: agriculture, multispectral, crop, sunflower
- **Access Constraints**: CC BY 4.0
- **Use Constraints**: CC BY 4.0 |
LaMOP/Basis-Latin-French |
---
annotations_creators:
- no-annotation
license: cc-by-sa-4.0
task_categories:
- mask-generation
language:
- la
tags:
- latin
---
# Dataset Card for Basis-Latin-French
The Basis-Latin-French dataset is an unannotated Latin and Old French corpus of nearly 80 million words, compiled from different resources from the web. These resources include the Corpus de la Bourgogne du Moyen Âge, the e-NDP project, HIMANIS Guérin, the HOME-Alcar project, the Corpus Cisterciens et Ressources, and a dump of the Latin Wikisource.
### Dataset Sources
***[Corpus de la Bourgogne du Moyen Âge](http://www.cbma-project.eu/)***
"Projet CBMA - Corpus Burgundiae Medii Aevi. Site du projet Corpus de la Bourgogne du Moyen Âge, [En ligne]. http://www.cbma-project.eu (consulté le 10/02/2024)."
***The e-NDP project***: collaborative digital edition of the Chapter registers of Notre-Dame of Paris (1326-1504). Ground-truth for handwriting text recognition (HTR) on late medieval manuscripts.
[https://zenodo.org/records/7575693](https://zenodo.org/records/7575693)
"Claustre, J., Smith, D., Torres Aguilar, S., Bretthauer, I., Brochard, P., Canteaut, O., Cottereau, E., Delivré, F., Denglos, M., Jolivet, V., Julerot, V., Kouamé, T., Lusset, E., Massoni, A., Nadiras, S., Perreaux, N., Regazzi, H., & Treglia, M. (2023). The e-NDP project : collaborative digital edition of the Chapter registers of Notre-Dame of Paris (1326-1504). Ground-truth for handwriting text recognition (HTR) on late medieval manuscripts. (1.0, p. https://zenodo.org/record/7575693) [Data set]. Zenodo. https://doi.org/10.5281/zenodo.7575693"
***HIMANIS Guérin***
[https://zenodo.org/records/5535306](https://zenodo.org/records/5535306)
"Stutzmann, D., Hamel, S., Kernier, I. de ., Mühlberger, G., & Hackl, G. (2021). HIMANIS Guérin [Data set]. Zenodo. https://doi.org/10.5281/zenodo.5535306"
***HOME-Alcar: Aligned and Annotated Cartularies***
[https://zenodo.org/records/5600884](https://zenodo.org/records/5600884)
"Stutzmann, D., Torres Aguilar, S., & Chaffenet, P. (2021). HOME-Alcar: Aligned and Annotated Cartularies [Data set]. Zenodo. https://doi.org/10.5281/zenodo.5600884"
***Corpus Cisterciens et Ressources***
[https://cisterciensetressources.lamop.fr/](https://cisterciensetressources.lamop.fr/)
"Corpus Cisterciens et Ressources - Benoit Rouzeau, Danielle Arribet-Deroin, Pierre Brochard, Version 1.1, mise en ligne 22/09/2022, consulté le 10/02/2024. URL : https://cisterciensetressources.lamop.fr/"
***Wikisource Dump*** (lawikisource-20240201-pages-meta-current.xml)
[https://dumps.wikimedia.org/lawikisource/20240201/](https://dumps.wikimedia.org/lawikisource/20240201/)
## Dataset Structure
### Data Fields
`text`: a text passage in Latin. Example:
{
"text": "Yvo secundus, de Chasant, Abbas XXVI. in die sepulturæ Yuonis primi, prædecessoris sui electus fuit. Sed incœpit regere anno 1275. Rexit annis XIIII. obiit III. Nonas Nouembris 1289. Iacet apud Cluniacum inter altaria S. Andreæ, & S. Clementis. Iste secundus Yuo fuit Prior S. Martini Parisiensis. Venerandus namque ipse Pater bonæ memoriæ, affluens misericordiæ visceribus specialiter erga Conuentum Cluniacensem. Nam impetrauit quoddam priuilegium in quo continetur, quod domnus Papa inhibet districte & districtius, quod nullus Abbas se intromittat de pictanciis Conuentus, nisi de eius Conuentus voluntate. Item apud Giureium in montana acquisiuit a Domino Duce Burgundiæ magnam, & altam iustitiā dicti loci, & tres homines tailliabiles, quos Dominus Dux habebat in prædicta villa. Item fecit castrum de Giureio. Item fecit domum nouam, granarium de auena, & torcular. Itē perfecit imaginem B. Marię, quę est de auro, pro qua eius prædecessor immediatus Yuo dimisit XXVI. marchas auri, ex quibus facta fuit. Item capsam S. Margaretæ. Item Sanctuarium, siue vexillum de argento, quod portant duo Angeli. Item tres cappas ad imagines factas. Item fecit domos nouas de Botauant. Item & de Besornay. Item de Escurolles, & muros in circuitu. Item acquisiuit a Domino Belli-ioci talliam, quam dictus domnus faciebat hominib. Ecclesiæ Cluniac. in terra sua, quando erat domnus nouus. Item constituit vinum purum in solemnitatibus Sanctorum Abbatum Cluniacensium. Item in solemnitate beatæ Mariæ Magdalenæ vinum purum cum flaconibus. Item statuit charitatem de vino puro, quando aliquis Monachus moritur in Monasterio Cluniacensi. Item statuit octo cereos in Capella beatæ Mariæ de infirmariis, qui accenduntur quando venit Conuentus in dictam Capellam. Anima eius requiescat in pace."
}
### Data Splits
The dataset is not split. |
odunola/bible-reference-sentence-pair | ---
license: apache-2.0
---
|
pharaouk/dharma-test2 | ---
configs:
- config_name: default
data_files:
- split: 'dharma-test2_shuffled'
path: final/dharma-test2_eval_shuffled*
- split: 'dharma-test2_unshuffled'
path: final/dharma-test2_eval_unshuffled*
---
# "dharma-test2 Dataset"
A dharma evaluation dataset with the following configuration:
||| Subject: MMLU, Size: 12 |||
||| Subject: ARC-Challenge, Size: 12 |||
||| Subject: ARC-Easy, Size: 12 |||
||| Subject: BoolQ, Size: 12 |||
||| Subject: winogrande, Size: 12 |||
||| Subject: openbookqa, Size: 12 |||
||| Subject: truthful_qa, Size: 12 |||
||| Subject: agieval, Size: 12 |||
Made with https://github.com/pharaouk/dharma 🚀
|
m-newhauser/food101 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': apple_pie
'1': baby_back_ribs
'2': baklava
'3': beef_carpaccio
'4': beef_tartare
'5': beet_salad
'6': beignets
'7': bibimbap
'8': bread_pudding
'9': breakfast_burrito
'10': bruschetta
'11': caesar_salad
'12': cannoli
'13': caprese_salad
'14': carrot_cake
'15': ceviche
'16': cheesecake
'17': cheese_plate
'18': chicken_curry
'19': chicken_quesadilla
'20': chicken_wings
'21': chocolate_cake
'22': chocolate_mousse
'23': churros
'24': clam_chowder
'25': club_sandwich
'26': crab_cakes
'27': creme_brulee
'28': croque_madame
'29': cup_cakes
'30': deviled_eggs
'31': donuts
'32': dumplings
'33': edamame
'34': eggs_benedict
'35': escargots
'36': falafel
'37': filet_mignon
'38': fish_and_chips
'39': foie_gras
'40': french_fries
'41': french_onion_soup
'42': french_toast
'43': fried_calamari
'44': fried_rice
'45': frozen_yogurt
'46': garlic_bread
'47': gnocchi
'48': greek_salad
'49': grilled_cheese_sandwich
'50': grilled_salmon
'51': guacamole
'52': gyoza
'53': hamburger
'54': hot_and_sour_soup
'55': hot_dog
'56': huevos_rancheros
'57': hummus
'58': ice_cream
'59': lasagna
'60': lobster_bisque
'61': lobster_roll_sandwich
'62': macaroni_and_cheese
'63': macarons
'64': miso_soup
'65': mussels
'66': nachos
'67': omelette
'68': onion_rings
'69': oysters
'70': pad_thai
'71': paella
'72': pancakes
'73': panna_cotta
'74': peking_duck
'75': pho
'76': pizza
'77': pork_chop
'78': poutine
'79': prime_rib
'80': pulled_pork_sandwich
'81': ramen
'82': ravioli
'83': red_velvet_cake
'84': risotto
'85': samosa
'86': sashimi
'87': scallops
'88': seaweed_salad
'89': shrimp_and_grits
'90': spaghetti_bolognese
'91': spaghetti_carbonara
'92': spring_rolls
'93': steak
'94': strawberry_shortcake
'95': sushi
'96': tacos
'97': takoyaki
'98': tiramisu
'99': tuna_tartare
'100': waffles
- name: id
dtype: int64
- name: image_base64
dtype: string
- name: embeddings
sequence: float32
splits:
- name: train
num_bytes: 7448625547.25
num_examples: 75750
download_size: 7525378584
dataset_size: 7448625547.25
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Sesamoo/pdf-3dsimulation | ---
license: unknown
---
|
jrs-a/batangueno-accent | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype: audio
- name: input_length
dtype: string
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 244706143.0
num_examples: 471
download_size: 225571755
dataset_size: 244706143.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "batangueno-accent"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/reed_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of reed/リード/苇草 (Arknights)
This is the dataset of reed/リード/苇草 (Arknights), containing 398 images and their tags.
The core tags of this character are `horns, long_hair, dragon_horns, blonde_hair, dragon_girl, ahoge, tail, breasts, dragon_tail, green_eyes, blue_eyes, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 398 | 774.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/reed_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 398 | 636.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/reed_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 982 | 1.19 GiB | [Download](https://huggingface.co/datasets/CyberHarem/reed_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/reed_arknights',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 19 |  |  |  |  |  | 1girl, solo, looking_at_viewer, white_dress, white_flower, black_jacket, hair_flower, upper_body, closed_mouth, simple_background, white_background, black_gloves, open_jacket, holding_flower, smile |
| 1 | 5 |  |  |  |  |  | 1girl, black_sweater, earrings, long_sleeves, looking_at_viewer, ribbed_sweater, simple_background, solo, white_background, white_jacket, black_gloves, closed_mouth, cowboy_shot, open_jacket, turtleneck, black_dress, grey_hair, medium_breasts, off_shoulder, white_hair |
| 2 | 5 |  |  |  |  |  | 1girl, black_gloves, black_sweater, dress, flame-tipped_tail, holding_polearm, long_sleeves, looking_at_viewer, ribbed_sweater, solo, white_jacket, black_footwear, black_thighhighs, fire, simple_background, thigh_boots, full_body, spear, grey_background, holding_staff, jewelry, open_jacket, white_background, white_hair |
| 3 | 9 |  |  |  |  |  | 1girl, long_sleeves, solo, black_headwear, black_jacket, braid, looking_at_viewer, earrings, witch_hat, closed_mouth, collared_shirt, open_jacket, white_shirt, black_footwear, black_necktie, diagonal-striped_clothes, diagonal-striped_necktie, fire, flame-tipped_tail, holding_book, off_shoulder, pantyhose, smile, white_background, white_dress |
| 4 | 8 |  |  |  |  |  | 1girl, simple_background, collarbone, completely_nude, looking_at_viewer, navel, blush, solo, nipples, large_breasts, stomach, white_background, closed_mouth, cowboy_shot, flame-tipped_tail, medium_breasts, pussy, smile, spread_legs, sweat, uncensored |
| 5 | 7 |  |  |  |  |  | 1girl, blush, 1boy, hetero, nipples, penis, mosaic_censoring, nude, pussy, solo_focus, blunt_bangs, ejaculation, large_breasts, medium_breasts, navel, spread_legs, sweat, cum_on_hair, erection, facial, looking_at_viewer, on_back, parted_lips, sex, vaginal |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | white_dress | white_flower | black_jacket | hair_flower | upper_body | closed_mouth | simple_background | white_background | black_gloves | open_jacket | holding_flower | smile | black_sweater | earrings | long_sleeves | ribbed_sweater | white_jacket | cowboy_shot | turtleneck | black_dress | grey_hair | medium_breasts | off_shoulder | white_hair | dress | flame-tipped_tail | holding_polearm | black_footwear | black_thighhighs | fire | thigh_boots | full_body | spear | grey_background | holding_staff | jewelry | black_headwear | braid | witch_hat | collared_shirt | white_shirt | black_necktie | diagonal-striped_clothes | diagonal-striped_necktie | holding_book | pantyhose | collarbone | completely_nude | navel | blush | nipples | large_breasts | stomach | pussy | spread_legs | sweat | uncensored | 1boy | hetero | penis | mosaic_censoring | nude | solo_focus | blunt_bangs | ejaculation | cum_on_hair | erection | facial | on_back | parted_lips | sex | vaginal |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------------|:---------------|:---------------|:--------------|:-------------|:---------------|:--------------------|:-------------------|:---------------|:--------------|:-----------------|:--------|:----------------|:-----------|:---------------|:-----------------|:---------------|:--------------|:-------------|:--------------|:------------|:-----------------|:---------------|:-------------|:--------|:--------------------|:------------------|:-----------------|:-------------------|:-------|:--------------|:------------|:--------|:------------------|:----------------|:----------|:-----------------|:--------|:------------|:-----------------|:--------------|:----------------|:---------------------------|:---------------------------|:---------------|:------------|:-------------|:------------------|:--------|:--------|:----------|:----------------|:----------|:--------|:--------------|:--------|:-------------|:-------|:---------|:--------|:-------------------|:-------|:-------------|:--------------|:--------------|:--------------|:-----------|:---------|:----------|:--------------|:------|:----------|
| 0 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | | | | | | X | X | X | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | | | | | | | X | X | X | X | | | X | | X | X | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 9 |  |  |  |  |  | X | X | X | X | | X | | | X | | X | | X | | X | | X | X | | | | | | | | X | | | X | | X | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 8 |  |  |  |  |  | X | X | X | | | | | | X | X | X | | | | X | | | | | | X | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 5 | 7 |  |  |  |  |  | X | | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
liuyanchen1015/MULTI_VALUE_rte_present_perfect_for_past | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 817036
num_examples: 2176
- name: train
num_bytes: 701912
num_examples: 1784
download_size: 963637
dataset_size: 1518948
---
# Dataset Card for "MULTI_VALUE_rte_present_perfect_for_past"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hyperdemocracy/usc-billstatus | ---
configs:
- config_name: default
data_files:
- split: '108'
path: data/usc-108-billstatus.parquet
- split: '109'
path: data/usc-109-billstatus.parquet
- split: '110'
path: data/usc-110-billstatus.parquet
- split: '111'
path: data/usc-111-billstatus.parquet
- split: '112'
path: data/usc-112-billstatus.parquet
- split: '113'
path: data/usc-113-billstatus.parquet
- split: '114'
path: data/usc-114-billstatus.parquet
- split: '115'
path: data/usc-115-billstatus.parquet
- split: '116'
path: data/usc-116-billstatus.parquet
- split: '117'
path: data/usc-117-billstatus.parquet
- split: '118'
path: data/usc-118-billstatus.parquet
license: mit
language:
- en
---
# Dataset Description
This dataset is part of a family of datasets that provide convenient access to
congressional data from the US [Government Publishing Office](https://www.gpo.gov/)
via the [GovInfo Bulk Data Repository](https://www.govinfo.gov/developers).
GovInfo provides bulk data in XML format.
The raw xml files were downloaded using the
[congress](https://github.com/unitedstates/congress) repo.
Further processing was done using the
hyperdemocracy [congress_prep](https://github.com/hyperdemocracy/congress-prep) repo.
# Hyperdemocracy Datasets
* [usc-billstatus](https://huggingface.co/datasets/hyperdemocracy/usc-billstatus) (metadata on each bill)
* [usc-textversions](https://huggingface.co/datasets/hyperdemocracy/usc-textversions) (different text versions of bills)
* [usc-unified](https://huggingface.co/datasets/hyperdemocracy/usc-unified) (combined metadata and text versions)
# BILLSTATUS (metadata for congresses 108-118)
* https://www.govinfo.gov/bulkdata/BILLSTATUS
* https://github.com/usgpo/bill-status/blob/main/BILLSTATUS-XML_User_User-Guide.md
* https://github.com/usgpo/bulk-data/blob/main/Bills-XML-User-Guide.md
These XML files contain metadata about each bill and
pointers to the XML files that contain the various text versions of each bill.
# Column Descriptions
| Column | Description |
|--------|-------------|
| legis_id | a unique ID for each bill (`{congress_num}-{legis_type}-{legis_num}`) |
| congress_num | the congress number for the bill |
| legis_type | one of [`hr`, `hres`, `hconres`, `hjres`, `s`, `sres`, `sconres`, `sjres`] (see [govinfo - types of legislation](https://www.govinfo.gov/help/bills)) |
| legis_num | bills in each congress and of each type get an incrementing number as part of their ID |
| scrape_path | XML file path during bulk download |
| lastmod | lastmod date during bulk download |
| bs_xml | contents of billstatus XML file |
| bs_json| billstatus XML parsed into JSON |
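As a quick illustrative sketch (not part of the dataset tooling), a `legis_id` can be split back into the components described above:

```python
def parse_legis_id(legis_id: str) -> dict:
    """Split a legis_id of the form '{congress_num}-{legis_type}-{legis_num}'."""
    congress_num, legis_type, legis_num = legis_id.split("-")
    return {
        "congress_num": congress_num,
        "legis_type": legis_type,
        "legis_num": legis_num,
    }

print(parse_legis_id("117-hr-3076"))
```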
# Examples
The dataset is broken into splits (one split per congress number).
```python
from datasets import load_dataset
# load each split into a `DatasetDict` keyed on congress number
dsd = load_dataset(path="hyperdemocracy/usc-billstatus")
# load a single congress number into a `Dataset`
ds = load_dataset(path="hyperdemocracy/usc-billstatus", split="117")
# load all congress numbers into a single `Dataset`
ds = load_dataset(path="hyperdemocracy/usc-billstatus", split="all")
```
# Congress Number to Date Mapping
| Congress Number | Years | Metadata | Text |
|-----------------|-------|----------|------|
| 118 | 2023-2024 | True | True |
| 117 | 2021-2022 | True | True |
| 116 | 2019-2020 | True | True |
| 115 | 2017-2018 | True | True |
| 114 | 2015-2016 | True | True |
| 113 | 2013-2014 | True | True |
| 112 | 2011-2012 | True | False |
| 111 | 2009-2010 | True | False |
| 110 | 2007-2008 | True | False |
| 109 | 2005-2006 | True | False |
| 108 | 2003-2004 | True | False |
|
lawful-good-project/ipc-inst-2k | ---
license: gpl-3.0
task_categories:
- text-generation
language:
- ru
tags:
- legal
size_categories:
- 1K<n<10K
---
A dataset of rulings of the Intellectual Property Court of the Russian Federation, with instruction syntax for fine-tuning with instructions.
zolak/twitter_dataset_80_1713225540 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 141421
num_examples: 349
download_size: 78121
dataset_size: 141421
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CATIE-AQ/fquad_fr_prompt_context_generation_with_answer | ---
language:
- fr
license:
- cc-by-nc-sa-3.0
size_categories:
- 100k<n<1M
task_categories:
- text-generation
tags:
- DFP
- french prompts
annotations_creators:
- found
language_creators:
- found
multilinguality:
- monolingual
source_datasets:
- fquad
---
# fquad_fr_prompt_context_generation_with_answer
## Summary
**fquad_fr_prompt_context_generation_with_answer** is a subset of the [**Dataset of French Prompts (DFP)**](https://huggingface.co/datasets/CATIE-AQ/DFP).
It contains **574,056** rows that can be used for a text generation task.
The original data (without prompts) comes from the dataset [FQuAD]( https://huggingface.co/datasets/fquad) by d'Hoffschmidt et al. and was augmented by questions in SQUAD 2.0 format in the [FrenchQA]( https://huggingface.co/datasets/CATIE-AQ/frenchQA) dataset.
As FQuAD's license does not allow data to be shared, we simply share the prompts used, so that users can recreate the dataset themselves in the same format as the [xP3](https://huggingface.co/datasets/bigscience/xP3) dataset by Muennighoff et al.
## Prompts used
### List
24 prompts were created for this dataset. Each instruction is proposed in three forms: the infinitive, the informal *tutoiement*, and the formal *vouvoiement*.
```
'Étant donné la réponse "'+ answer+'", écrire un texte explicatif.\nTexte : ',
'Étant donné la réponse "'+ answer+'", écris un texte explicatif.\nTexte : ',
'Étant donné la réponse "'+ answer+'", écrivez un texte explicatif.\nTexte : ',
'Étant donné la réponse "'+ answer+'", rédiger un texte explicatif.\nTexte : ',
'Étant donné la réponse "'+ answer+'", rédige un texte explicatif.\nTexte : ',
'Étant donné la réponse "'+ answer+'", rédigez un texte explicatif.\nTexte : ',
'Étant donné la réponse "'+ answer+'", générer un texte explicatif.\nTexte : ',
'Étant donné la réponse "'+ answer+'", génère un texte explicatif.\nTexte : ',
'Étant donné la réponse "'+ answer+'", générez un texte explicatif.\nTexte : ',
'Étant donné la réponse "'+ answer+'", créer un texte explicatif.\nTexte : ',
'Étant donné la réponse "'+ answer+'", crée un texte explicatif.\nTexte : ',
'Étant donné la réponse "'+ answer+'", créez un texte explicatif.\nTexte : ',
'Ecrire un texte comme contexte de la réponse "'+ answer+'" \nTexte : ',
'Ecris un texte comme contexte de la réponse "'+ answer+'" \nTexte : ',
'Ecrivez un texte comme contexte de la réponse "'+ answer+'" \nTexte : ',
'Rédiger un texte comme contexte de la réponse "'+ answer+'" \nTexte : ',
'Rédige un texte comme contexte de la réponse "'+ answer+'" \nTexte : ',
'Rédigez un texte comme contexte de la réponse "'+ answer+'" \nTexte : ',
'Générer un texte comme contexte de la réponse "'+ answer+'" \nTexte : ',
'Génère un texte comme contexte de la réponse "'+ answer+'" \nTexte : ',
'Générez un texte comme contexte de la réponse "'+ answer+'" \nTexte : ',
'Créer un texte comme contexte de la réponse "'+ answer+'" \nTexte : ',
'Crée un texte comme contexte de la réponse "'+ answer+'" \nTexte : ',
'Créez un texte comme contexte de la réponse "'+ answer+'" \nTexte : ',
```
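As a sketch of how these templates can be applied to recreate the rows yourself (all names below are illustrative; the templates above use Python string concatenation with an `answer` variable, rewritten here with `str.format`, and you must extract the (answer, context) pairs from FQuAD on your own):

```python
# illustrative (answer, context) pairs extracted from FQuAD by the user
answer_context_pairs = [
    ("Paris", "Paris est la capitale de la France."),
]

# two of the 24 templates, rewritten with a {answer} placeholder;
# the remaining templates follow the same pattern
templates = [
    'Étant donné la réponse "{answer}", écrire un texte explicatif.\nTexte : ',
    'Étant donné la réponse "{answer}", écris un texte explicatif.\nTexte : ',
]

# one generated row per (pair, template) combination
rows = [
    {"inputs": template.format(answer=answer), "targets": context}
    for answer, context in answer_context_pairs
    for template in templates
]
print(len(rows))
```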
# Splits
- `train` with 497,544 samples
- `valid` with 76,512 samples
- no test split
# How to use?
This repository doesn't contain any data.
# Citation
## Original data
> @ARTICLE{2020arXiv200206071,
author = {Martin, d'Hoffschmidt and Maxime, Vidal and Wacim, Belblidia and Tom, Brendlé},
title = "{FQuAD: French Question Answering Dataset}",
journal = {arXiv e-prints},
keywords = {Computer Science - Computation and Language},
year = "2020",
month = "Feb",
eid = {arXiv:2002.06071},
pages = {arXiv:2002.06071},
archivePrefix = {arXiv},
eprint = {2002.06071},
primaryClass = {cs.CL}
}
## This Dataset
> @misc {centre_aquitain_des_technologies_de_l'information_et_electroniques_2023,
author = { {Centre Aquitain des Technologies de l'Information et Electroniques} },
title = { DFP (Revision 1d24c09) },
year = 2023,
url = { https://huggingface.co/datasets/CATIE-AQ/DFP },
doi = { 10.57967/hf/1200 },
publisher = { Hugging Face }
}
## License
CC BY-NC-SA 3.0 |
lum-ai/metal-python-ood-climate-explanatations | ---
dataset_info:
features:
- name: id
dtype: string
- name: chunk_id
dtype: string
- name: text
dtype: string
- name: start_text
dtype: int64
- name: stop_text
dtype: int64
- name: code
dtype: string
- name: start_code
dtype: int64
- name: stop_code
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 799974
num_examples: 94
download_size: 48110
dataset_size: 799974
---
# Dataset Card for "metal-python-ood-climate-explanatations"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_mnli_one_relativizer | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 1854205
num_examples: 8837
- name: dev_mismatched
num_bytes: 1951863
num_examples: 8852
- name: test_matched
num_bytes: 1856870
num_examples: 8801
- name: test_mismatched
num_bytes: 1958840
num_examples: 8910
- name: train
num_bytes: 74359218
num_examples: 349487
download_size: 52874996
dataset_size: 81980996
---
# Dataset Card for "MULTI_VALUE_mnli_one_relativizer"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Lajonbot__vicuna-7b-v1.5-PL-lora_unload | ---
pretty_name: Evaluation run of Lajonbot/vicuna-7b-v1.5-PL-lora_unload
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Lajonbot/vicuna-7b-v1.5-PL-lora_unload](https://huggingface.co/Lajonbot/vicuna-7b-v1.5-PL-lora_unload)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Lajonbot__vicuna-7b-v1.5-PL-lora_unload\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T03:39:03.666834](https://huggingface.co/datasets/open-llm-leaderboard/details_Lajonbot__vicuna-7b-v1.5-PL-lora_unload/blob/main/results_2023-09-23T03-39-03.666834.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0065016778523489934,\n\
\ \"em_stderr\": 0.0008230684297223919,\n \"f1\": 0.06541946308724841,\n\
\ \"f1_stderr\": 0.0015883719778429714,\n \"acc\": 0.3959174184839032,\n\
\ \"acc_stderr\": 0.009871427981667812\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0065016778523489934,\n \"em_stderr\": 0.0008230684297223919,\n\
\ \"f1\": 0.06541946308724841,\n \"f1_stderr\": 0.0015883719778429714\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07202426080363912,\n \
\ \"acc_stderr\": 0.007121147983537124\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7198105761641673,\n \"acc_stderr\": 0.012621707979798499\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Lajonbot/vicuna-7b-v1.5-PL-lora_unload
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|arc:challenge|25_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T03_39_03.666834
path:
- '**/details_harness|drop|3_2023-09-23T03-39-03.666834.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T03-39-03.666834.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T03_39_03.666834
path:
- '**/details_harness|gsm8k|5_2023-09-23T03-39-03.666834.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T03-39-03.666834.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hellaswag|10_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-02T16:36:13.785976.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-02T16:36:13.785976.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-02T16:36:13.785976.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T03_39_03.666834
path:
- '**/details_harness|winogrande|5_2023-09-23T03-39-03.666834.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T03-39-03.666834.parquet'
- config_name: results
data_files:
- split: 2023_08_02T16_36_13.785976
path:
- results_2023-08-02T16:36:13.785976.parquet
- split: 2023_09_23T03_39_03.666834
path:
- results_2023-09-23T03-39-03.666834.parquet
- split: latest
path:
- results_2023-09-23T03-39-03.666834.parquet
---
# Dataset Card for Evaluation run of Lajonbot/vicuna-7b-v1.5-PL-lora_unload
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Lajonbot/vicuna-7b-v1.5-PL-lora_unload
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Lajonbot/vicuna-7b-v1.5-PL-lora_unload](https://huggingface.co/Lajonbot/vicuna-7b-v1.5-PL-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Lajonbot__vicuna-7b-v1.5-PL-lora_unload",
"harness_winogrande_5",
split="train")
```
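As the config list above shows, each run's split is named after its timestamp, with `-` and `:` replaced by `_` (the dot before the microseconds is kept). A minimal sketch of that naming convention (the helper name is ours, not part of the dataset):

```python
def run_timestamp_to_split(ts: str) -> str:
    """Map a run timestamp such as '2023-09-23T03:39:03.666834'
    to its split name: '-' and ':' become '_', the microsecond dot stays."""
    return ts.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split("2023-09-23T03:39:03.666834"))
# 2023_09_23T03_39_03.666834
```

Passing such a split name to `load_dataset` (instead of `"train"` or `"latest"`) selects the results of that specific run.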
## Latest results
These are the [latest results from run 2023-09-23T03:39:03.666834](https://huggingface.co/datasets/open-llm-leaderboard/details_Lajonbot__vicuna-7b-v1.5-PL-lora_unload/blob/main/results_2023-09-23T03-39-03.666834.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0065016778523489934,
"em_stderr": 0.0008230684297223919,
"f1": 0.06541946308724841,
"f1_stderr": 0.0015883719778429714,
"acc": 0.3959174184839032,
"acc_stderr": 0.009871427981667812
},
"harness|drop|3": {
"em": 0.0065016778523489934,
"em_stderr": 0.0008230684297223919,
"f1": 0.06541946308724841,
"f1_stderr": 0.0015883719778429714
},
"harness|gsm8k|5": {
"acc": 0.07202426080363912,
"acc_stderr": 0.007121147983537124
},
"harness|winogrande|5": {
"acc": 0.7198105761641673,
"acc_stderr": 0.012621707979798499
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
gemgem104/test | ---
license: other
---
|
BENBENBENb/McTest640COT | ---
task_categories:
- question-answering
language:
- en
--- |
kaxap/pg-gpt4SQL-sql-instructions-1k | ---
license: cc-by-nc-4.0
---
The dataset is constructed by taking the first 1000 rows of the train split of the [pg-wikiSQL](https://huggingface.co/datasets/kaxap/pg-wikiSQL) dataset and asking GPT-4 to transform the query and the question to be more complex using various aggregate functions.
Resulting SQL statements were adapted for Postgres syntax and conventions.
Each SQL statement, including `CREATE TABLE` statements were syntax checked with [pgsanity](https://github.com/markdrago/pgsanity).
The `total_tokens` column indicates the OpenAI API usage for the datapoint generation. |
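Since the `total_tokens` column records per-row API usage, summing it gives the total token cost of generating the dataset. A minimal offline sketch, where the example rows are hypothetical and only the `total_tokens` field name comes from the card:

```python
# Hypothetical rows mirroring the card's `total_tokens` column.
rows = [
    {"question": "...", "query": "...", "total_tokens": 512},
    {"question": "...", "query": "...", "total_tokens": 731},
]

# Summing per-row usage estimates the total OpenAI API cost of generation.
total_usage = sum(r["total_tokens"] for r in rows)
print(total_usage)  # 1243
```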
jxie/lipop | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: float64
splits:
- name: train_0
num_bytes: 200193
num_examples: 3360
- name: val_0
num_bytes: 24928
num_examples: 420
- name: test_0
num_bytes: 24770
num_examples: 420
- name: train_1
num_bytes: 199909
num_examples: 3360
- name: val_1
num_bytes: 25212
num_examples: 420
- name: test_1
num_bytes: 24770
num_examples: 420
- name: train_2
num_bytes: 200080
num_examples: 3360
- name: val_2
num_bytes: 24726
num_examples: 420
- name: test_2
num_bytes: 25085
num_examples: 420
download_size: 387383
dataset_size: 749673
---
# Dataset Card for "lipop"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yiming19/construction_test | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 109249653.0
num_examples: 13
download_size: 6553556
dataset_size: 109249653.0
---
# Dataset Card for "construction_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
romero61/test_merra_pm25 | ---
license: mit
---
|