| datasetId | card |
|---|---|
jmbrito/b3-historical-quotes | ---
license: mit
tags:
- finance
- b3
- quotes
- historical
pretty_name: B3 Historical Quotes
size_categories:
- 1M<n<10M
---
# B3 Historical Quotes
<!-- Provide a quick summary of the dataset. -->
This dataset is a collection of historical quotes from the Brazilian stock market (B3).
It contains historical quotes for all stocks in the country from January 2015 to October 2023.
## Dataset Details
All the data was retrieved as-is from [B3 Historical Data](https://www.b3.com.br/en_us/market-data-and-indices/data-services/market-data/historical-data/equities/historical-quotes/)
and parsed to CSV. The columns are the same as those in the original content.
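Since the files are plain CSV, a row can be read with the standard library alone. A minimal sketch (the sample content and column names below are placeholders for illustration, not B3's actual layout, which is described in the official documentation):

```python
import csv
import io

# Placeholder sample imitating a parsed quote file; the real files follow
# B3's official column layout.
sample = io.StringIO(
    "date,ticker,open,high,low,close\n"
    "2015-01-02,PETR4,9.36,9.40,9.10,9.21\n"
)

rows = list(csv.DictReader(sample))           # each row becomes a dict keyed by header
closes = [float(r["close"]) for r in rows]    # numeric fields are parsed as strings
```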
If you need more information about the columns, it can be found in the [official B3 documentation](https://www.b3.com.br/en_us/market-data-and-indices/data-services/market-data/historical-data/equities/historical-quote-data/). |
bensonlinnnnn/train1 | ---
license: unknown
---
|
jignasha/medical | ---
license: mit
---
|
Fsg-15/pn_summary | ---
license: apache-2.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: article
dtype: string
- name: summary
dtype: string
- name: category
dtype:
class_label:
names:
'0': Economy
'1': Roads-Urban
'2': Banking-Insurance
'3': Agriculture
'4': International
'5': Oil-Energy
'6': Industry
'7': Transportation
'8': Science-Technology
'9': Local
'10': Sports
'11': Politics
'12': Art-Culture
'13': Society
'14': Health
'15': Research
'16': Education-University
'17': Tourism
- name: categories
dtype: string
- name: network
dtype:
class_label:
names:
'0': Tahlilbazaar
'1': Imna
'2': Shana
'3': Mehr
'4': Irna
'5': Khabaronline
- name: link
dtype: string
splits:
- name: train
num_bytes: 406448
num_examples: 100
- name: validation
num_bytes: 39018
num_examples: 10
- name: test
num_bytes: 39018
num_examples: 10
download_size: 282405
dataset_size: 484484
---
|
ovior/twitter_dataset_1713025671 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2510002
num_examples: 7661
download_size: 1419780
dataset_size: 2510002
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
joey234/mmlu-prehistory-original-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 8842
num_examples: 20
download_size: 9986
dataset_size: 8842
---
# Dataset Card for "mmlu-prehistory-original-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bouim/dvoice2 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
- name: duration
dtype: float64
splits:
- name: train
num_bytes: 1459262910.208
num_examples: 2117
- name: test
num_bytes: 75535309.0
num_examples: 114
download_size: 1032875305
dataset_size: 1534798219.208
---
# Dataset Card for "dvoice2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/lmind_nq_train6000_eval6489_v1_doc_qa_random_permute | ---
dataset_info:
features:
- name: answers
struct:
- name: answer_start
sequence: 'null'
- name: text
sequence: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: all_docs_eval
num_bytes: 7125701
num_examples: 10925
- name: validation
num_bytes: 752802
num_examples: 6489
- name: train_qa
num_bytes: 697367
num_examples: 6000
- name: train_ic_qa
num_bytes: 4540536
num_examples: 6000
- name: train
num_bytes: 29202939
num_examples: 49700
- name: eval_ic_qa
num_bytes: 4906186
num_examples: 6489
- name: eval_recite_qa
num_bytes: 4912675
num_examples: 6489
- name: all_docs
num_bytes: 7126313
num_examples: 10925
- name: eval_qa
num_bytes: 752802
num_examples: 6489
- name: train_recite_qa
num_bytes: 4546536
num_examples: 6000
- name: first_permute_docs
num_bytes: 37615961
num_examples: 57692
- name: random_permute_docs
num_bytes: 28505572
num_examples: 43700
download_size: 42973377
dataset_size: 130685390
---
# Dataset Card for "lmind_nq_train6000_eval6489_v1_doc_qa_random_permute"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lionelchg/dolly_classification | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: category
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1242435.7322097379
num_examples: 2029
- name: test
num_bytes: 65520.26779026217
num_examples: 107
download_size: 740864
dataset_size: 1307956.0
---
# Dataset Card for "dolly_classification"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
artemsnegirev/ru-word-games | ---
license: cc-by-4.0
language:
- ru
size_categories:
- 100K<n<1M
task_categories:
- text-generation
- text2text-generation
pretty_name: Word Games
---
## Dataset Summary
The dataset contains more than 100k word-description pairs, where each description is a crossword-style clue. It can be useful for models that generate a description for a given word, or that try to guess a word from a description.
Source code for the parsers and an example project are available [here](https://github.com/artemsnegirev/minibob)
Key stats:
- Number of examples: 133223
- Number of sources: 8
- Number of unique answers: 35024
| subset | count |
|--------------|-------|
| 350_zagadok | 350 |
| bashnya_slov | 43522 |
| crosswords | 39290 |
| guess_answer | 1434 |
| ostrova | 1526 |
| top_seven | 6643 |
| ugadaj_slova | 7406 |
| umnyasha | 33052 |
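For the text-generation and text2text-generation tasks listed above, each pair can be cast in either direction. A minimal sketch (the prompt prefixes `describe:` and `guess:` are assumptions for illustration, not part of the dataset):

```python
def to_text2text(word: str, description: str, direction: str = "describe") -> dict:
    """Cast a word-description pair as a seq2seq example in either direction."""
    if direction == "describe":
        # Generate a crossword-style clue for a given word.
        return {"input": f"describe: {word}", "target": description}
    # Guess the word from its clue.
    return {"input": f"guess: {description}", "target": word}

pair = to_text2text("кошка", "домашнее животное, которое мяукает")
```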
|
nlp-brin-id/fakenews-mafindo | ---
license: mit
task_categories:
- text-classification
language:
- id
size_categories:
- 10K<n<100K
---
Raw dataset for "Fact-Aware Fake-news Classification for Indonesian Language"</br></br>
<b>Disclaimer:</b> Beta version; contains an imbalanced representation of domain-specific NON-HOAX samples. We will soon release a new training and evaluation suite as a replacement for this dataset. </br></br>
Data originates from https://turnbackhoax.id/ (Mafindo data 2018-2023); </br>
The attributes of data are: </br>
1. Label_id: Binary class labels ("HOAX"==1 ; "NON-HOAX"==0).</br>
2. Label: Binary class labels ("HOAX" or "NON-HOAX").</br>
3. Title: Claim or headline of news article.</br>
4. Title_cleaned: Preprocessed claim, obtained by removing the tag label at the beginning of the sentence.</br>
5. Content: the content of news article. </br>
6. Fact: The summary of factual evidence that either supports or contradicts the corresponding claim.</br>
7. References: URL link of news article and the corresponding verdict or factual evidence as the justification of the news article.</br>
8. Classification: Fine-grained classification labels for the news article: 'CekFakta', 'Fabricated Content', 'False Connection', 'False Context', 'Impostor Content', 'Manipulated Content', 'Misleading Content', 'Satire', 'nan'.</br></br>
Example of usage:</br>
```python
>>> from datasets import load_dataset
>>> train_dataset = load_dataset(
... "nlp-brin-id/fakenews-mafindo",
... split="train",
... keep_default_na=False,
... ).select_columns(['Label_id', 'Title_cleaned', 'Content', 'Fact'])
``` |
jtatman/sciphi-mini-600m-unsloth-processed | ---
dataset_info:
features:
- name: text
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 232968067.11257112
num_examples: 26575
- name: val
num_bytes: 25887288.88742888
num_examples: 2953
download_size: 96906399
dataset_size: 258855356.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
---
|
YingJie0202/tech_to_proc | ---
dataset_info:
features:
- name: technique
dtype: string
- name: prompt
dtype: string
- name: procedure
dtype: string
splits:
- name: train
num_bytes: 19917628
num_examples: 14750
download_size: 1268928
dataset_size: 19917628
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
PeacefulData/HyPoradise-v1-GigaSpeech | ---
license: mit
language_creators:
- expert-generated
task_categories:
- text-generation
tags:
- code
- Whisper-tiny
pretty_name: Whispering LLaMA for new Hypotheses Paradise Subset
size_categories:
- 1k<n<10M
---
- If you consider this work related or useful for your research, please consider citing the EMNLP 2023 paper below. Thank you.
```bib
@inproceedings{radhakrishnan2023whispering,
title={Whispering LLaMA: A Cross-Modal Generative Error Correction Framework for Speech Recognition},
  author={Srijith Radhakrishnan and Chao-Han Huck Yang and Sumeer Ahmad Khan and Rohit Kumar and Narsis A. Kiani and David Gomez-Cabrero and Jesper N. Tegner},
booktitle={Proc. of EMNLP},
year={2023}
}
``` |
disi-unibo-nlp/medqa-MedGENIE | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: target
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: text
dtype: string
splits:
- name: train
num_bytes: 75592146
num_examples: 10178
- name: validation
num_bytes: 9526548
num_examples: 1272
- name: test
num_bytes: 9660480
num_examples: 1273
download_size: 5680157
dataset_size: 94779174
license: mit
task_categories:
- question-answering
language:
- en
tags:
- medical
---
# Dataset Card for "medqa-MedGENIE"
## Dataset Description
The data is a part of the MedGENIE collection of medical datasets augmented with artificial contexts generated by [PMC-LLaMA-13B](https://huggingface.co/axiong/PMC_LLaMA_13B). Specifically, up to 5 artificial contexts were generated for each question in [MedQA-USMLE](https://github.com/jind11/MedQA) (4 options), employing a multi-view approach to encompass various perspectives associated with the given question.
## Dataset Structure
The dataset has three splits, suitable for:
* Training *question-answering* models, including *fusion-in-decoder* architectures.
* Augmenting your LLMs during inference with generated contexts rather than retrieved chunks.
* Augmenting your knowledge base of factual documents with generated contexts for a standard RAG pipeline.
The number of examples per split is:
- **train:** 10178 samples
- **validation:** 1272 samples
- **test:** 1273 samples
The dataset is stored in parquet format with each entry using the following schema:
```
{
"id": 0,
"question": "A 23-year-old pregnant woman at 22 weeks gestation presents with burning upon urination. She states it started 1 day ago and has been worsening despite drinking more water and taking cranberry extract. She otherwise feels well and is followed by a doctor for her pregnancy. Her temperature is 97.7\u00b0F (36.5\u00b0C), blood pressure is 122/77 mmHg, pulse is 80/min, respirations are 19/min, and oxygen saturation is 98% on room air. Physical exam is notable for an absence of costovertebral angle tenderness and a gravid uterus. Which of the following is the best treatment for this patient?\nA. Ampicillin\nB. Ceftriaxone\nC. Doxycycline\nD. Nitrofurantoin",
"target": "D",
"answers": [
"D"
],
"ctxs": [
{
"text": "The burning upon urination in a pregnant female is often due to asymptomatic bacteriuria that results in a urinary tract infection (UTI). Such UTIs must be aggressively treated because of their association with preterm labor..."
},
{
"text": "This patient has urinary tract infection (UTI) symptoms, which is a common condition in pregnancy.\n- Nitrofurantoin and cephalexin are considered safe for use during pregnancy. Ceftriaxone and ampicillin can cross the placenta..."
},
{
"text": "Asymptomatic bacteriuria is defined as the presence of a positive urine culture in an asymptomatic patient. The most common complication from untreated asymptomatic bacteriuria is a UTI during pregnancy which can result in kidney..."
},
{
"text": "Asymptomatic bacteriuria is a frequent finding in pregnancy. Treatment is not recommended unless there are signs of an upper urinary tract infection, ie, fever (temperature >99\u00b0F/37\u00b0C), flank pain or tenderness, or pyuria... "
},
{
"text": "Asymptomatic bacteriuria is present if a patient has persistent (>2 weeks) bacteria in the urine as documented by a positive urine culture with no symptoms. In pregnancy, even if asymptomatic, bacteriuria increases the risk of pyelonephritis..."
}
]
}
```
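Following the schema above, a minimal sketch of concatenating an entry's generated contexts into a single augmented prompt, as done when passing them within an LLM's context window (the field names match the entry above; the prompt template itself is an assumption):

```python
def build_augmented_prompt(entry: dict, max_ctxs: int = 5) -> str:
    """Concatenate up to max_ctxs generated contexts ahead of the question."""
    context_block = "\n\n".join(c["text"] for c in entry["ctxs"][:max_ctxs])
    return f"Context:\n{context_block}\n\nQuestion:\n{entry['question']}\nAnswer:"

# Toy entry mirroring the schema; real entries come from the parquet files.
entry = {
    "question": "Which of the following is the best treatment for this patient?",
    "ctxs": [
        {"text": "Nitrofurantoin is considered safe for use during pregnancy."},
        {"text": "Asymptomatic bacteriuria is a frequent finding in pregnancy."},
    ],
}
prompt = build_augmented_prompt(entry)
```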
## Augmenting LLMs during inference
Augmenting *state-of-the-art* LLMs with generated contexts from both **medqa-MedGENIE** and [medmcqa-MedGENIE](https://huggingface.co/datasets/disi-unibo-nlp/medmcqa-MedGENIE/blob/main/README.md) demonstrated a remarkable performance boost. For a given question, all relevant contexts are concatenated and passed within the context window of the LLM.
| Model | Learning|medqa-5-opt-MedGENIE |Accuracy |
|------|------|-----|-----|
| LLaMA-2-chat (7B)|2-shot | NO|36.9 |
| LLaMA-2-chat (7B)| 2-shot|YES |52.4 **(+ 15.5)** |
| Zephyr-β (7B)|2-shot|NO | 49.3 |
| Zephyr-β (7B)|2-shot| YES |59.7 **(+ 10.4)** |
## Evaluation for RAG
To assess the effectiveness of using our generated contexts in a RAG pipeline, we augment the [MedWiki](https://huggingface.co/datasets/VOD-LM/medwiki) dataset with a smaller portion of artificially generated chunks derived from the train and test sets of **medqa-MedGENIE** and [medmcqa-MedGENIE](https://huggingface.co/datasets/disi-unibo-nlp/medmcqa-MedGENIE).
| MedWiki chunks | Artificial chunks | Rerank | LLaMA-2-chat (7B) | mistral-instruct (7B) | Zephyr-β (7B) |
|------|-----|----------------|-------------------|-----------------------|---------------------|
| 4.5M | - | NO | 37.2 | 45.1 | 50.4 |
| 4.5M | 96K (only test)| NO | 40.2 **(+ 3.0)** | 44.9 | 50.5 **(+0.1)** |
| 4.5M | 2M (train + test)| NO | 40.8 **(+ 3.6)** | 44.4 | 51 **(+0.6)** |
| 4.5M | - | YES | 36.3 | 44.6 | 50.5 |
| 4.5M | 96K (only test)| YES | 41.4 **(+5.1)** | 45.6 **(+1.0)** | 50.8 **(+0.3)** |
| 4.5M | 2M (train + test)| YES | 40.5 **(+4.2)** | 45.9 **(+1.3)** | 51.2 **(+0.7)** |
## Citation
If you find this dataset useful in your work, please cite it with:
```
@misc{frisoni2024generate,
title={To Generate or to Retrieve? On the Effectiveness of Artificial Contexts for Medical Open-Domain Question Answering},
author={Giacomo Frisoni and Alessio Cocchieri and Alex Presepi and Gianluca Moro and Zaiqiao Meng},
year={2024},
eprint={2403.01924},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
ghomasHudson/vlsp | ---
language:
- en
---
# Dataset Card for vlsp
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** [Needs More Information]
- **Repository:** https://github.com/ghomasHudson/very_long_scientific_papers
- **Paper:** [Needs More Information]
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** [Needs More Information]
### Dataset Summary
A dataset following the methodology of the scientific_papers dataset, but specifically designed for very long documents (>10,000 words). It is gathered from arxiv.org by searching for theses.
The dataset has 2 features:
- article: the body of the document.
- abstract: the abstract of the document.
### Supported Tasks and Leaderboards
Summarization
### Languages
English
## Dataset Structure
### Data Instances
[Needs More Information]
### Data Fields
[Needs More Information]
### Data Splits
Only a test set is provided.
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
[Needs More Information]
|
Xangal/Xangal | ---
license: openrail
---
|
distilled-from-one-sec-cv12/chunk_95 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1258956580
num_examples: 245315
download_size: 1285175963
dataset_size: 1258956580
---
# Dataset Card for "chunk_95"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bbc_hindi_nli | ---
annotations_creators:
- machine-generated
language_creators:
- found
language:
- hi
license:
- mit
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- extended|bbc__hindi_news_classification
task_categories:
- text-classification
task_ids:
- natural-language-inference
pretty_name: BBC Hindi NLI Dataset
dataset_info:
config_name: bbc hindi nli
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': not-entailment
'1': entailment
- name: topic
dtype:
class_label:
names:
'0': india
'1': news
'2': international
'3': entertainment
'4': sport
'5': science
splits:
- name: train
num_bytes: 2990064
num_examples: 15552
- name: validation
num_bytes: 496800
num_examples: 2580
- name: test
num_bytes: 494424
num_examples: 2592
download_size: 309124
dataset_size: 3981288
configs:
- config_name: bbc hindi nli
data_files:
- split: train
path: bbc hindi nli/train-*
- split: validation
path: bbc hindi nli/validation-*
- split: test
path: bbc hindi nli/test-*
default: true
---
# Dataset Card for BBC Hindi NLI Dataset
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** [GitHub](https://github.com/midas-research/hindi-nli-data)
- **Paper:** [Aclweb](https://www.aclweb.org/anthology/2020.aacl-main.71)
- **Point of Contact:** [GitHub](https://github.com/midas-research/hindi-nli-data)
### Dataset Summary
- Dataset for Natural Language Inference in the Hindi language. The BBC Hindi dataset consists of textual-entailment pairs.
- Each row of the dataset is made up of 4 columns: Premise, Hypothesis, Label and Topic.
- Premise and Hypothesis are written in Hindi, while the entailment label is in English.
- The entailment label has 2 values: entailed and not-entailed.
- The dataset can be used to train models for Natural Language Inference tasks in the Hindi language.
[More Information Needed]
### Supported Tasks and Leaderboards
- Natural Language Inference for Hindi
### Languages
Dataset is in Hindi
## Dataset Structure
- Data is structured in TSV format.
- Train and test data are provided in separate files.
### Data Instances
An example of 'train' looks as follows.
```
{'hypothesis': 'यह खबर की सूचना है|', 'label': 'entailed', 'premise': 'गोपनीयता की नीति', 'topic': '1'}
```
### Data Fields
- Each row contains 4 columns: Premise, Hypothesis, Label and Topic.
### Data Splits
- Train: 15552
- Valid: 2580
- Test: 2592
## Dataset Creation
- We employ the recasting technique of Poliak et al. (2018a,b) to convert publicly available BBC Hindi news text classification datasets into textual-entailment (TE) problems.
- In this recasting process, we build template hypotheses for each class in the label taxonomy.
- Then, we pair the original annotated sentence with each of the template hypotheses to create TE samples.
- For more information on the recasting process, refer to the paper: https://www.aclweb.org/anthology/2020.aacl-main.71
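The recasting steps above can be sketched as follows (the template hypotheses here are illustrative placeholders; the actual wording used by the authors may differ):

```python
# Illustrative template hypotheses per topic; the paper's actual templates may differ.
TEMPLATES = {
    "entertainment": "यह मनोरंजन की खबर है|",
    "sport": "यह खेल की खबर है|",
    "science": "यह विज्ञान की खबर है|",
}

def recast(sentence: str, gold_topic: str) -> list:
    """Pair one classified sentence with every template hypothesis to build TE samples."""
    return [
        {
            "premise": sentence,
            "hypothesis": hypothesis,
            # Entailed only for the template matching the gold class label.
            "label": "entailed" if topic == gold_topic else "not-entailed",
        }
        for topic, hypothesis in TEMPLATES.items()
    ]

samples = recast("गोपनीयता की नीति", "entertainment")
```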
### Source Data
The source dataset for the recasting process is the [BBC Hindi Headlines Dataset](https://github.com/NirantK/hindi2vec/releases/tag/bbc-hindi-v0.1).
#### Initial Data Collection and Normalization
- The BBC Hindi News Classification dataset contains 4,335 Hindi news headlines tagged across 14 categories: India, Pakistan, news, international, entertainment, sport, science, China, learning english, social, southasia, business, institutional, multimedia.
- We processed this dataset to combine two sets of relevant but low-prevalence classes.
- Namely, we merged the samples from Pakistan, China, international, and southasia into one class called international.
- Likewise, we also merged samples from news, business, social, learning english, and institutional into news.
- Lastly, we removed the class multimedia because there were very few samples.
#### Who are the source language producers?
Please refer to this paper: https://www.aclweb.org/anthology/2020.aacl-main.71
### Annotations
#### Annotation process
Annotation process has been described in Dataset Creation Section.
#### Who are the annotators?
Annotation is done automatically.
### Personal and Sensitive Information
No Personal and Sensitive Information is mentioned in the Datasets.
## Considerations for Using the Data
Please refer to this paper: https://www.aclweb.org/anthology/2020.aacl-main.71
### Discussion of Biases
Please refer to this paper: https://www.aclweb.org/anthology/2020.aacl-main.71
### Other Known Limitations
No other known limitations
## Additional Information
Please refer to this link: https://github.com/midas-research/hindi-nli-data
### Dataset Curators
As stated in the repo https://github.com/avinsit123/hindi-nli-data:
- This corpus can be used freely for research purposes.
- The paper listed below provides details of the creation and use of the corpus. If you use the corpus, please cite the paper.
- If interested in commercial use of the corpus, send email to midas@iiitd.ac.in.
- If you use the corpus in a product or application, then please credit the authors and Multimodal Digital Media Analysis Lab - Indraprastha Institute of Information Technology, New Delhi appropriately. Also, if you send us an email, we will be thrilled to know about how you have used the corpus.
- Multimodal Digital Media Analysis Lab - Indraprastha Institute of Information Technology, New Delhi, India disclaims any responsibility for the use of the corpus and does not provide technical support. However, the contact listed above will be happy to respond to queries and clarifications.
- Rather than redistributing the corpus, please direct interested parties to this page
- Please feel free to send us an email:
- with feedback regarding the corpus.
- with information on how you have used the corpus.
- if interested in having us analyze your data for natural language inference.
- if interested in a collaborative research project.
### Licensing Information
Copyright (C) 2019 Multimodal Digital Media Analysis Lab - Indraprastha Institute of Information Technology, New Delhi (MIDAS, IIIT-Delhi).
Please contact the authors for any information on the dataset.
### Citation Information
```
@inproceedings{uppal-etal-2020-two,
title = "Two-Step Classification using Recasted Data for Low Resource Settings",
author = "Uppal, Shagun and
Gupta, Vivek and
Swaminathan, Avinash and
Zhang, Haimin and
Mahata, Debanjan and
Gosangi, Rakesh and
Shah, Rajiv Ratn and
Stent, Amanda",
booktitle = "Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing",
month = dec,
year = "2020",
address = "Suzhou, China",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2020.aacl-main.71",
pages = "706--719",
abstract = "An NLP model{'}s ability to reason should be independent of language. Previous works utilize Natural Language Inference (NLI) to understand the reasoning ability of models, mostly focusing on high resource languages like English. To address scarcity of data in low-resource languages such as Hindi, we use data recasting to create NLI datasets for four existing text classification datasets. Through experiments, we show that our recasted dataset is devoid of statistical irregularities and spurious patterns. We further study the consistency in predictions of the textual entailment models and propose a consistency regulariser to remove pairwise-inconsistencies in predictions. We propose a novel two-step classification method which uses textual-entailment predictions for classification task. We further improve the performance by using a joint-objective for classification and textual entailment. We therefore highlight the benefits of data recasting and improvements on classification performance using our approach with supporting experimental results.",
}
```
### Contributions
Thanks to [@avinsit123](https://github.com/avinsit123) for adding this dataset. |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-136000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 6147896
num_examples: 461
download_size: 285889
dataset_size: 6147896
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Andaleciomusic/pirapora | ---
license: openrail
---
|
open-llm-leaderboard/details_facebook__xglm-564M | ---
pretty_name: Evaluation run of facebook/xglm-564M
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [facebook/xglm-564M](https://huggingface.co/facebook/xglm-564M) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_facebook__xglm-564M\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-15T23:39:39.394377](https://huggingface.co/datasets/open-llm-leaderboard/details_facebook__xglm-564M/blob/main/results_2023-10-15T23-39-39.394377.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.013422818791946308,\n\
\ \"em_stderr\": 0.0011784931108563684,\n \"f1\": 0.060359689597315525,\n\
\ \"f1_stderr\": 0.0017160396766447692,\n \"acc\": 0.2623842654231489,\n\
\ \"acc_stderr\": 0.007675207819463649\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.013422818791946308,\n \"em_stderr\": 0.0011784931108563684,\n\
\ \"f1\": 0.060359689597315525,\n \"f1_stderr\": 0.0017160396766447692\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \
\ \"acc_stderr\": 0.001312157814867416\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5224940805051302,\n \"acc_stderr\": 0.014038257824059881\n\
\ }\n}\n```"
repo_url: https://huggingface.co/facebook/xglm-564M
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|arc:challenge|25_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_15T23_39_39.394377
path:
- '**/details_harness|drop|3_2023-10-15T23-39-39.394377.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-15T23-39-39.394377.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_15T23_39_39.394377
path:
- '**/details_harness|gsm8k|5_2023-10-15T23-39-39.394377.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-15T23-39-39.394377.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hellaswag|10_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:24:31.422133.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T14:24:31.422133.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T14:24:31.422133.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_15T23_39_39.394377
path:
- '**/details_harness|winogrande|5_2023-10-15T23-39-39.394377.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-15T23-39-39.394377.parquet'
- config_name: results
data_files:
- split: 2023_07_19T14_24_31.422133
path:
- results_2023-07-19T14:24:31.422133.parquet
- split: 2023_10_15T23_39_39.394377
path:
- results_2023-10-15T23-39-39.394377.parquet
- split: latest
path:
- results_2023-10-15T23-39-39.394377.parquet
---
# Dataset Card for Evaluation run of facebook/xglm-564M
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/facebook/xglm-564M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [facebook/xglm-564M](https://huggingface.co/facebook/xglm-564M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_facebook__xglm-564M",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-15T23:39:39.394377](https://huggingface.co/datasets/open-llm-leaderboard/details_facebook__xglm-564M/blob/main/results_2023-10-15T23-39-39.394377.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.013422818791946308,
"em_stderr": 0.0011784931108563684,
"f1": 0.060359689597315525,
"f1_stderr": 0.0017160396766447692,
"acc": 0.2623842654231489,
"acc_stderr": 0.007675207819463649
},
"harness|drop|3": {
"em": 0.013422818791946308,
"em_stderr": 0.0011784931108563684,
"f1": 0.060359689597315525,
"f1_stderr": 0.0017160396766447692
},
"harness|gsm8k|5": {
"acc": 0.002274450341167551,
"acc_stderr": 0.001312157814867416
},
"harness|winogrande|5": {
"acc": 0.5224940805051302,
"acc_stderr": 0.014038257824059881
}
}
```
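As a small illustration (a local sketch, not part of the leaderboard tooling), the aggregated "acc" in the "all" block above can be reproduced by averaging the per-task "acc" values; the metric values below are copied from the results JSON in this card:

```python
# Per-task "acc" metrics, copied from the "Latest results" block of this card.
latest = {
    "harness|gsm8k|5": {"acc": 0.002274450341167551},
    "harness|winogrande|5": {"acc": 0.5224940805051302},
}

# The "all" block averages each metric over the tasks that report it.
accs = [metrics["acc"] for metrics in latest.values()]
mean_acc = sum(accs) / len(accs)
print(mean_acc)  # matches "acc" = 0.2623842654231489 in the "all" block (up to float rounding)
```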
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
yzhuang/metatree_quake | ---
dataset_info:
features:
- name: id
dtype: int64
- name: X
sequence: float64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 67276
num_examples: 1529
- name: validation
num_bytes: 28556
num_examples: 649
download_size: 59604
dataset_size: 95832
---
# Dataset Card for "metatree_quake"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fouzan/coloring-book-test | ---
license: creativeml-openrail-m
---
|
Llamas-competition/public_validation_data | ---
dataset_info:
features:
- name: image
dtype: image
- name: id
dtype: int64
splits:
- name: train
num_bytes: 3903897.476190476
num_examples: 124
download_size: 3652052
dataset_size: 3903897.476190476
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
arieg/cluster17_large_150 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '001075'
'1': '001703'
          '2': '018043'
          '3': '020818'
          '4': '024418'
          '5': '024424'
          '6': '026629'
          '7': '028481'
          '8': '035569'
          '9': '036986'
          '10': '036987'
          '11': '037781'
          '12': '038312'
          '13': '038363'
          '14': '039904'
          '15': '041605'
          '16': '042375'
          '17': '046158'
          '18': '046162'
          '19': '047199'
          '20': '047201'
          '21': '048861'
          '22': '049068'
          '23': '050323'
          '24': '052862'
          '25': '054662'
          '26': '055097'
          '27': '055809'
          '28': '056031'
          '29': '057271'
          '30': '057968'
          '31': '062448'
          '32': '062449'
          '33': '063747'
          '34': '064896'
          '35': '066537'
          '36': '066638'
          '37': '068893'
          '38': '068895'
          '39': '069206'
          '40': '069208'
          '41': '069222'
          '42': '071303'
          '43': '072067'
          '44': '072928'
          '45': '073123'
          '46': '073125'
          '47': '073340'
          '48': '075926'
          '49': '076381'
          '50': '078847'
          '51': '080611'
          '52': '084198'
          '53': '084202'
          '54': '086040'
          '55': '089704'
          '56': '090530'
          '57': '091625'
          '58': '092573'
          '59': '097794'
          '60': '097813'
'61': '107579'
'62': '108318'
'63': '110109'
'64': '110204'
'65': '110205'
'66': '110208'
'67': '110265'
'68': '111148'
'69': '112781'
'70': '115002'
'71': '119095'
'72': '119545'
'73': '120462'
'74': '121663'
'75': '122204'
'76': '123935'
'77': '124912'
'78': '126773'
'79': '127798'
'80': '130920'
'81': '131024'
'82': '131436'
'83': '132566'
'84': '134073'
'85': '135219'
'86': '136705'
'87': '137054'
'88': '137721'
'89': '138022'
'90': '140316'
'91': '140925'
'92': '140926'
'93': '141287'
'94': '142362'
'95': '148099'
'96': '148439'
'97': '155066'
splits:
- name: train
num_bytes: 748930382.0
num_examples: 14700
download_size: 753547762
dataset_size: 748930382.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
looppayments/question_answering_token_classification_2024_02_01 | ---
pretty_name: Question Answering Token Classification
---
Total train samples: 225237
Total test samples: 42026
Total tasks: 9
| Task | Train | Test |
| ---- | ----- | ---- |
|reference_number_association_without_question_boxes/2024-02-01|25000|5055|
|reference_numbers/2024-02-01|25006|5072|
|reference_number_association_with_question_boxes/2024-02-01|25000|5004|
|table_extraction_without_question_boxes/2024-02-01|25200|5128|
|table_cell_incremental_without_question_boxes/2024-02-01|25002|5127|
|table_cell_incremental_with_question_boxes/2024-02-01|25001|5005|
|table_header_with_question_boxes/2024-02-01|25007|5005|
|key_value/2024-02-01|25000|5000|
|label_and_location/2024-02-01|25021|1630|
Total artifact_qids: 38917
|
erfanzar/lmsys-lite | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: conversation_id
dtype: string
- name: openai_moderation
list:
- name: categories
struct:
- name: harassment
dtype: bool
- name: harassment/threatening
dtype: bool
- name: hate
dtype: bool
- name: hate/threatening
dtype: bool
- name: self-harm
dtype: bool
- name: self-harm/instructions
dtype: bool
- name: self-harm/intent
dtype: bool
- name: sexual
dtype: bool
- name: sexual/minors
dtype: bool
- name: violence
dtype: bool
- name: violence/graphic
dtype: bool
- name: category_scores
struct:
- name: harassment
dtype: float64
- name: harassment/threatening
dtype: float64
- name: hate
dtype: float64
- name: hate/threatening
dtype: float64
- name: self-harm
dtype: float64
- name: self-harm/instructions
dtype: float64
- name: self-harm/intent
dtype: float64
- name: sexual
dtype: float64
- name: sexual/minors
dtype: float64
- name: violence
dtype: float64
- name: violence/graphic
dtype: float64
- name: flagged
dtype: bool
- name: conversation
list:
- name: content
dtype: string
- name: role
dtype: string
- name: list_conversation
sequence: string
- name: llama_2_prompt_style
dtype: string
splits:
- name: train
num_bytes: 3447659164
num_examples: 437224
download_size: 1688182571
dataset_size: 3447659164
---
# Dataset Card for "lmsys-lite"
This dataset is a lite version of `lmsys/lmsys-chat-1m`. It contains only English-language conversations, filtered to these models:
- `gpt-3.5-turbo`
- `gpt-4`
- `llama-2-13b-chat`
- `llama-2-7b-chat`
- `mpt-30b-chat`
- `mpt-7b-chat`
- `palm-2`
- `vicuna-13b`
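The `llama_2_prompt_style` column suggests each conversation is also stored pre-rendered as a Llama 2 chat prompt. A minimal sketch of producing such a rendering from a `conversation` list, assuming the common Llama 2 template (whether this exact template matches the stored column is an assumption):

```python
# Render a conversation (list of {"role", "content"} dicts, matching the
# `conversation` feature above) into the common Llama 2 chat format.
# The exact template stored in `llama_2_prompt_style` is an assumption.
def to_llama2_prompt(conversation):
    prompt = ""
    i = 0
    while i < len(conversation):
        turn = conversation[i]
        if turn["role"] == "user":
            prompt += f"<s>[INST] {turn['content']} [/INST]"
            # Attach the assistant reply, if one follows, and close the turn.
            if i + 1 < len(conversation) and conversation[i + 1]["role"] == "assistant":
                prompt += f" {conversation[i + 1]['content']} </s>"
                i += 1
        i += 1
    return prompt

example = [
    {"role": "user", "content": "Hello!"},
    {"role": "assistant", "content": "Hi, how can I help?"},
]
print(to_llama2_prompt(example))
```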
|
result-kand2-sdxl-wuerst-karlo/80da83b2 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 219
num_examples: 10
download_size: 1431
dataset_size: 219
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "80da83b2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
marmofayezi/M3CollabDiff | ---
dataset_info:
features:
- name: id
dtype: string
- name: mask
dtype: image
- name: caption
dtype: string
- name: generated_image
dtype: image
splits:
- name: train
num_bytes: 845280773.0
num_examples: 2998
download_size: 845199921
dataset_size: 845280773.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
SAGI-1/reasoningData_200k | ---
language:
- en
size_categories:
- 100K<n<1M
task_categories:
- text-generation
tags:
- reasoning
dataset_info:
features:
- name: instruction
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 211139066
num_examples: 201928
download_size: 125279312
dataset_size: 211139066
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
james-burton/vet_month_1d_all_text | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: age_at_consult
dtype: string
- name: Ear_or_Mastoid
dtype: string
- name: Mental_Behavioral_or_Neuro
dtype: string
- name: Blood_or_Blood-forming
dtype: string
- name: Circulatory
dtype: string
- name: Dental
dtype: string
- name: Developmental
dtype: string
- name: Digestive
dtype: string
- name: Endocrine_Nutritional_or_Metabolic
dtype: string
- name: Immune
dtype: string
- name: Infectious_or_Parasitic
dtype: string
- name: Skin
dtype: string
- name: Musculoskeletal_or_Connective_Tissue
dtype: string
- name: Neoplasms
dtype: string
- name: Nervous
dtype: string
- name: Visual
dtype: string
- name: Perinatal
dtype: string
- name: Pregnancy_Childbirth_or_Puerperium
dtype: string
- name: Respiratory
dtype: string
- name: Injury_Poisoning_or_External_Causes
dtype: string
- name: Genitourinary
dtype: string
- name: gender
dtype: string
- name: neutered
dtype: string
- name: species
dtype: string
- name: insured
dtype: string
- name: practice_id
dtype: string
- name: premise_id
dtype: string
- name: breed
dtype: string
- name: region
dtype: string
- name: record
dtype: string
- name: labels
dtype: int64
splits:
- name: train
num_bytes: 5353930
num_examples: 8552
- name: validation
num_bytes: 946736
num_examples: 1510
- name: test
num_bytes: 1635039
num_examples: 2606
download_size: 4002909
dataset_size: 7935705
---
# Dataset Card for "vet_month_1d_all_text"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FlyingFishzzz/destination_test | ---
dataset_info:
features:
- name: target
dtype: image
- name: prompt
dtype: string
- name: landmarks
dtype: string
- name: condition
dtype: image
splits:
- name: train
num_bytes: 476552150.944
num_examples: 1588
download_size: 475421663
dataset_size: 476552150.944
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "destination_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mskov/misophonia_Sounds | ---
license: cc
language:
- en
pretty_name: misophoniaSounds
dataset_info:
features:
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 45800162.0
num_examples: 22
- name: test
num_bytes: 45351160.0
num_examples: 17
download_size: 65004669
dataset_size: 91151322.0
---
|
CyberHarem/lute_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of lute/ルーテ (Fire Emblem)
This is the dataset of lute/ルーテ (Fire Emblem), containing 241 images and their tags.
The core tags of this character are `purple_hair, purple_eyes, breasts, twintails`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 241 | 249.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lute_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 241 | 151.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lute_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 493 | 280.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lute_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 241 | 225.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lute_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 493 | 374.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lute_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/lute_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, solo, cleavage, simple_background, hair_flower, holding_book, navel, long_hair, medium_breasts, white_background, bare_shoulders, looking_at_viewer, bangs, closed_mouth, purple_bikini, collarbone, full_body, sandals |
| 1 | 8 |  |  |  |  |  | 1girl, bare_shoulders, dress, solo, cape, holding_book, simple_background, white_background, low_twintails, full_body, looking_at_viewer, short_hair, smile, upper_body |
| 2 | 21 |  |  |  |  |  | 1girl, navel, nipples, solo, collarbone, small_breasts, blush, completely_nude, pussy, looking_at_viewer, holding_book, standing, bangs, mosaic_censoring, medium_hair, full_body, open_mouth |
| 3 | 11 |  |  |  |  |  | 1girl, bare_shoulders, fur_trim, hat, long_sleeves, solo, bangs, official_alternate_costume, choker, flower, looking_at_viewer, twin_braids, boots, long_hair, open_mouth, simple_background, white_dress, white_footwear, christmas, closed_mouth, collarbone, food, white_background, full_body, holding, white_headwear |
| 4 | 5 |  |  |  |  |  | 1girl, completely_nude, hetero, mosaic_censoring, multiple_penises, nipples, solo_focus, blush, navel, on_back, 3boys, collarbone, cum_on_hair, facial, gangbang, medium_breasts, small_breasts, spread_legs, sweat, 2boys, bangs, bukkake, closed_eyes, cum_in_pussy, cum_on_breasts, double_handjob, ejaculation, hand_on_another's_head, heart, leg_grab, open_mouth, rape |
| 5 | 5 |  |  |  |  |  | 1boy, 1girl, hetero, penis, solo_focus, nipples, open_mouth, sex, vaginal, blush, cum_in_pussy, medium_breasts, mosaic_censoring, nude, cowgirl_position, girl_on_top, oral |
| 6 | 5 |  |  |  |  |  | 1girl, blush, solo, tears, arms_behind_back, crotch_rope, nipples, pussy_juice, torn_clothes, white_panties, open_mouth, peeing_self, shibari_over_clothes, small_breasts, wet_panties |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | cleavage | simple_background | hair_flower | holding_book | navel | long_hair | medium_breasts | white_background | bare_shoulders | looking_at_viewer | bangs | closed_mouth | purple_bikini | collarbone | full_body | sandals | dress | cape | low_twintails | short_hair | smile | upper_body | nipples | small_breasts | blush | completely_nude | pussy | standing | mosaic_censoring | medium_hair | open_mouth | fur_trim | hat | long_sleeves | official_alternate_costume | choker | flower | twin_braids | boots | white_dress | white_footwear | christmas | food | holding | white_headwear | hetero | multiple_penises | solo_focus | on_back | 3boys | cum_on_hair | facial | gangbang | spread_legs | sweat | 2boys | bukkake | closed_eyes | cum_in_pussy | cum_on_breasts | double_handjob | ejaculation | hand_on_another's_head | heart | leg_grab | rape | 1boy | penis | sex | vaginal | nude | cowgirl_position | girl_on_top | oral | tears | arms_behind_back | crotch_rope | pussy_juice | torn_clothes | white_panties | peeing_self | shibari_over_clothes | wet_panties |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------|:--------------------|:--------------|:---------------|:--------|:------------|:-----------------|:-------------------|:-----------------|:--------------------|:--------|:---------------|:----------------|:-------------|:------------|:----------|:--------|:-------|:----------------|:-------------|:--------|:-------------|:----------|:----------------|:--------|:------------------|:--------|:-----------|:-------------------|:--------------|:-------------|:-----------|:------|:---------------|:-----------------------------|:---------|:---------|:--------------|:--------|:--------------|:-----------------|:------------|:-------|:----------|:-----------------|:---------|:-------------------|:-------------|:----------|:--------|:--------------|:---------|:-----------|:--------------|:--------|:--------|:----------|:--------------|:---------------|:-----------------|:-----------------|:--------------|:-------------------------|:--------|:-----------|:-------|:-------|:--------|:------|:----------|:-------|:-------------------|:--------------|:-------|:--------|:-------------------|:--------------|:--------------|:---------------|:----------------|:--------------|:-----------------------|:--------------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | | X | | X | | | | X | X | X | | | | | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 21 |  |  |  |  |  | X | X | | | | X | X | | | | | X | X | | | X | X | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 11 |  |  |  |  |  | X | X | | X | | | | X | | X | X | X | X | X | | X | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | | | | | X | | X | | | | X | | | X | | | | | | | | | X | X | X | X | | | X | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | | | | | | X | | | | | | | | | | | | | | | | X | | X | | | | X | | X | | | | | | | | | | | | | | | X | | X | | | | | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X |
|
dvilasuero/databricks-dolly-15k-es-deepl | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: category
dtype: string
- name: instruction_en
dtype: string
- name: context_en
dtype: string
- name: response_en
dtype: string
splits:
- name: train
num_bytes: 25838910
num_examples: 15015
download_size: 16464221
dataset_size: 25838910
---
# Dataset Card for "databricks-dolly-15k-es-deepl"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Seenka/banners-jose | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': none
'1': videograph
'2': zocalo
- name: yolo_out
list:
- name: class
dtype: int64
- name: confidence
dtype: float64
- name: name
dtype: string
- name: xmax
dtype: float64
- name: xmin
dtype: float64
- name: ymax
dtype: float64
- name: ymin
dtype: float64
- name: cropped_image
dtype: image
- name: ocr_out
list:
- name: bbox
sequence:
sequence: float64
- name: confidence
dtype: float64
- name: text
dtype: string
- name: embeddings
sequence: float32
- name: embeddings_cropped
sequence: float32
- name: yolo_seenka_out
list:
- name: class
dtype: int64
- name: confidence
dtype: float64
- name: name
dtype: string
- name: xmax
dtype: float64
- name: xmin
dtype: float64
- name: ymax
dtype: float64
- name: ymin
dtype: float64
- name: yolo_filter_order
dtype: int64
- name: yolo_seenka_all_out
list:
- name: class
dtype: int64
- name: confidence
dtype: float64
- name: name
dtype: string
- name: xmax
dtype: float64
- name: xmin
dtype: float64
- name: ymax
dtype: float64
- name: ymin
dtype: float64
- name: yolo_filter_param
dtype: int64
- name: cropped_seenka_image
dtype: image
- name: yolo_sobel_out
list:
- name: class
dtype: int64
- name: confidence
dtype: float64
- name: name
dtype: string
- name: xmax
dtype: float64
- name: xmin
dtype: float64
- name: ymax
dtype: float64
- name: ymin
dtype: float64
splits:
- name: train
num_bytes: 455505874.375
num_examples: 1999
- name: test
num_bytes: 104043535.0
num_examples: 421
download_size: 561217652
dataset_size: 559549409.375
---
# Dataset Card for "banners-jose"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
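The `yolo_out` / `yolo_seenka_out` entries above carry absolute pixel box coordinates (`xmin`/`ymin`/`xmax`/`ymax`). A minimal sketch of clamping such a detection to image bounds to get an integer crop box (the detection values here are fabricated, and whether `cropped_image` was produced exactly this way is an assumption):

```python
# Clamp a YOLO detection box (xmin/ymin/xmax/ymax, as in the yolo_out
# schema above) to image bounds and return an integer crop box.
def to_crop_box(det, img_w, img_h):
    xmin = max(0, int(det["xmin"]))
    ymin = max(0, int(det["ymin"]))
    xmax = min(img_w, int(det["xmax"]))
    ymax = min(img_h, int(det["ymax"]))
    return (xmin, ymin, xmax, ymax)

# Fabricated detection for illustration only.
det = {"class": 0, "confidence": 0.9, "name": "banner",
       "xmin": -3.2, "ymin": 10.7, "xmax": 1950.0, "ymax": 400.4}
print(to_crop_box(det, 1920, 1080))  # (0, 10, 1920, 400)
```

Clamping matters because detector boxes can spill slightly outside the frame.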
HumanCompatibleAI/ppo-seals-Hopper-v0 | ---
dataset_info:
features:
- name: obs
sequence:
sequence: float64
- name: acts
sequence:
sequence: float32
- name: infos
sequence: string
- name: terminal
dtype: bool
- name: rews
sequence: float64
splits:
- name: train
num_bytes: 54477160
num_examples: 104
download_size: 16464511
dataset_size: 54477160
---
# Dataset Card for "ppo-seals-Hopper-v0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_saltlux__luxia-21.4b-alignment-v0.3 | ---
pretty_name: Evaluation run of saltlux/luxia-21.4b-alignment-v0.3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [saltlux/luxia-21.4b-alignment-v0.3](https://huggingface.co/saltlux/luxia-21.4b-alignment-v0.3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_saltlux__luxia-21.4b-alignment-v0.3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-11T19:24:25.613292](https://huggingface.co/datasets/open-llm-leaderboard/details_saltlux__luxia-21.4b-alignment-v0.3/blob/main/results_2024-03-11T19-24-25.613292.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6866698827821776,\n\
\ \"acc_stderr\": 0.031609908633197605,\n \"acc_norm\": 0.6863396372897556,\n\
\ \"acc_norm_stderr\": 0.03227681695490394,\n \"mc1\": 0.5630354957160343,\n\
\ \"mc1_stderr\": 0.017363844503195967,\n \"mc2\": 0.6943518929245227,\n\
\ \"mc2_stderr\": 0.0152631127664847\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7568259385665529,\n \"acc_stderr\": 0.012536554144587087,\n\
\ \"acc_norm\": 0.7627986348122867,\n \"acc_norm_stderr\": 0.012430399829260835\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.8125871340370444,\n\
\ \"acc_stderr\": 0.0038944505016930363,\n \"acc_norm\": 0.915255925114519,\n\
\ \"acc_norm_stderr\": 0.002779313023771229\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.743421052631579,\n \"acc_stderr\": 0.0355418036802569,\n\
\ \"acc_norm\": 0.743421052631579,\n \"acc_norm_stderr\": 0.0355418036802569\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n\
\ \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \
\ \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7358490566037735,\n \"acc_stderr\": 0.027134291628741716,\n\
\ \"acc_norm\": 0.7358490566037735,\n \"acc_norm_stderr\": 0.027134291628741716\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8263888888888888,\n\
\ \"acc_stderr\": 0.03167473383795718,\n \"acc_norm\": 0.8263888888888888,\n\
\ \"acc_norm_stderr\": 0.03167473383795718\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6808510638297872,\n \"acc_stderr\": 0.030472973363380042,\n\
\ \"acc_norm\": 0.6808510638297872,\n \"acc_norm_stderr\": 0.030472973363380042\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5701754385964912,\n\
\ \"acc_stderr\": 0.04657047260594964,\n \"acc_norm\": 0.5701754385964912,\n\
\ \"acc_norm_stderr\": 0.04657047260594964\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.04013124195424385,\n\
\ \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.04013124195424385\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5079365079365079,\n \"acc_stderr\": 0.025748065871673297,\n \"\
acc_norm\": 0.5079365079365079,\n \"acc_norm_stderr\": 0.025748065871673297\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8419354838709677,\n\
\ \"acc_stderr\": 0.020752831511875267,\n \"acc_norm\": 0.8419354838709677,\n\
\ \"acc_norm_stderr\": 0.020752831511875267\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6009852216748769,\n \"acc_stderr\": 0.03445487686264716,\n\
\ \"acc_norm\": 0.6009852216748769,\n \"acc_norm_stderr\": 0.03445487686264716\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\"\
: 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721164,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721164\n \
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8484848484848485,\n \"acc_stderr\": 0.02554565042660362,\n \"\
acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.02554565042660362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812143,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812143\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.023234581088428494,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.023234581088428494\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.37777777777777777,\n \"acc_stderr\": 0.029560707392465708,\n\
\ \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.029560707392465708\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.773109243697479,\n \"acc_stderr\": 0.02720537153827948,\n \
\ \"acc_norm\": 0.773109243697479,\n \"acc_norm_stderr\": 0.02720537153827948\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"\
acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8642201834862385,\n \"acc_stderr\": 0.014686907556340013,\n \"\
acc_norm\": 0.8642201834862385,\n \"acc_norm_stderr\": 0.014686907556340013\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"\
acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8627450980392157,\n \"acc_stderr\": 0.024152225962801588,\n \"\
acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.024152225962801588\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8396624472573839,\n \"acc_stderr\": 0.02388438092596567,\n \
\ \"acc_norm\": 0.8396624472573839,\n \"acc_norm_stderr\": 0.02388438092596567\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7533632286995515,\n\
\ \"acc_stderr\": 0.028930413120910888,\n \"acc_norm\": 0.7533632286995515,\n\
\ \"acc_norm_stderr\": 0.028930413120910888\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n\
\ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8512396694214877,\n \"acc_stderr\": 0.032484700838071943,\n \"\
acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.032484700838071943\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.905982905982906,\n\
\ \"acc_stderr\": 0.019119892798924985,\n \"acc_norm\": 0.905982905982906,\n\
\ \"acc_norm_stderr\": 0.019119892798924985\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n\
\ \"acc_stderr\": 0.014000791294407,\n \"acc_norm\": 0.8109833971902938,\n\
\ \"acc_norm_stderr\": 0.014000791294407\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n\
\ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4480446927374302,\n\
\ \"acc_stderr\": 0.016631976628930595,\n \"acc_norm\": 0.4480446927374302,\n\
\ \"acc_norm_stderr\": 0.016631976628930595\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.024170840879340873,\n\
\ \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.024170840879340873\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7620578778135049,\n\
\ \"acc_stderr\": 0.024185150647818714,\n \"acc_norm\": 0.7620578778135049,\n\
\ \"acc_norm_stderr\": 0.024185150647818714\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.023132376234543346,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.023132376234543346\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5460992907801419,\n \"acc_stderr\": 0.029700453247291474,\n \
\ \"acc_norm\": 0.5460992907801419,\n \"acc_norm_stderr\": 0.029700453247291474\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.48891786179921776,\n\
\ \"acc_stderr\": 0.012767098998525843,\n \"acc_norm\": 0.48891786179921776,\n\
\ \"acc_norm_stderr\": 0.012767098998525843\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \
\ \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7591836734693878,\n \"acc_stderr\": 0.02737294220178816,\n\
\ \"acc_norm\": 0.7591836734693878,\n \"acc_norm_stderr\": 0.02737294220178816\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5630354957160343,\n\
\ \"mc1_stderr\": 0.017363844503195967,\n \"mc2\": 0.6943518929245227,\n\
\ \"mc2_stderr\": 0.0152631127664847\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8737174427782163,\n \"acc_stderr\": 0.009335559129908464\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6277482941622441,\n \
\ \"acc_stderr\": 0.013315375362565038\n }\n}\n```"
repo_url: https://huggingface.co/saltlux/luxia-21.4b-alignment-v0.3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|arc:challenge|25_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|gsm8k|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hellaswag|10_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T19-24-25.613292.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T19-24-25.613292.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- '**/details_harness|winogrande|5_2024-03-11T19-24-25.613292.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-11T19-24-25.613292.parquet'
- config_name: results
data_files:
- split: 2024_03_11T19_24_25.613292
path:
- results_2024-03-11T19-24-25.613292.parquet
- split: latest
path:
- results_2024-03-11T19-24-25.613292.parquet
---
# Dataset Card for Evaluation run of saltlux/luxia-21.4b-alignment-v0.3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [saltlux/luxia-21.4b-alignment-v0.3](https://huggingface.co/saltlux/luxia-21.4b-alignment-v0.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_saltlux__luxia-21.4b-alignment-v0.3",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-11T19:24:25.613292](https://huggingface.co/datasets/open-llm-leaderboard/details_saltlux__luxia-21.4b-alignment-v0.3/blob/main/results_2024-03-11T19-24-25.613292.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6866698827821776,
"acc_stderr": 0.031609908633197605,
"acc_norm": 0.6863396372897556,
"acc_norm_stderr": 0.03227681695490394,
"mc1": 0.5630354957160343,
"mc1_stderr": 0.017363844503195967,
"mc2": 0.6943518929245227,
"mc2_stderr": 0.0152631127664847
},
"harness|arc:challenge|25": {
"acc": 0.7568259385665529,
"acc_stderr": 0.012536554144587087,
"acc_norm": 0.7627986348122867,
"acc_norm_stderr": 0.012430399829260835
},
"harness|hellaswag|10": {
"acc": 0.8125871340370444,
"acc_stderr": 0.0038944505016930363,
"acc_norm": 0.915255925114519,
"acc_norm_stderr": 0.002779313023771229
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.743421052631579,
"acc_stderr": 0.0355418036802569,
"acc_norm": 0.743421052631579,
"acc_norm_stderr": 0.0355418036802569
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7358490566037735,
"acc_stderr": 0.027134291628741716,
"acc_norm": 0.7358490566037735,
"acc_norm_stderr": 0.027134291628741716
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8263888888888888,
"acc_stderr": 0.03167473383795718,
"acc_norm": 0.8263888888888888,
"acc_norm_stderr": 0.03167473383795718
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6808510638297872,
"acc_stderr": 0.030472973363380042,
"acc_norm": 0.6808510638297872,
"acc_norm_stderr": 0.030472973363380042
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5701754385964912,
"acc_stderr": 0.04657047260594964,
"acc_norm": 0.5701754385964912,
"acc_norm_stderr": 0.04657047260594964
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.04013124195424385,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.04013124195424385
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.025748065871673297,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.025748065871673297
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8419354838709677,
"acc_stderr": 0.020752831511875267,
"acc_norm": 0.8419354838709677,
"acc_norm_stderr": 0.020752831511875267
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6009852216748769,
"acc_stderr": 0.03445487686264716,
"acc_norm": 0.6009852216748769,
"acc_norm_stderr": 0.03445487686264716
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.031234752377721164,
"acc_norm": 0.8,
"acc_norm_stderr": 0.031234752377721164
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.02554565042660362,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.02554565042660362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812143,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812143
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7,
"acc_stderr": 0.023234581088428494,
"acc_norm": 0.7,
"acc_norm_stderr": 0.023234581088428494
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.029560707392465708,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.029560707392465708
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.773109243697479,
"acc_stderr": 0.02720537153827948,
"acc_norm": 0.773109243697479,
"acc_norm_stderr": 0.02720537153827948
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8642201834862385,
"acc_stderr": 0.014686907556340013,
"acc_norm": 0.8642201834862385,
"acc_norm_stderr": 0.014686907556340013
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.024152225962801588,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.024152225962801588
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8396624472573839,
"acc_stderr": 0.02388438092596567,
"acc_norm": 0.8396624472573839,
"acc_norm_stderr": 0.02388438092596567
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7533632286995515,
"acc_stderr": 0.028930413120910888,
"acc_norm": 0.7533632286995515,
"acc_norm_stderr": 0.028930413120910888
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8512396694214877,
"acc_stderr": 0.032484700838071943,
"acc_norm": 0.8512396694214877,
"acc_norm_stderr": 0.032484700838071943
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.905982905982906,
"acc_stderr": 0.019119892798924985,
"acc_norm": 0.905982905982906,
"acc_norm_stderr": 0.019119892798924985
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294407,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294407
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4480446927374302,
"acc_stderr": 0.016631976628930595,
"acc_norm": 0.4480446927374302,
"acc_norm_stderr": 0.016631976628930595
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.024170840879340873,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.024170840879340873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7620578778135049,
"acc_stderr": 0.024185150647818714,
"acc_norm": 0.7620578778135049,
"acc_norm_stderr": 0.024185150647818714
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.023132376234543346,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.023132376234543346
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5460992907801419,
"acc_stderr": 0.029700453247291474,
"acc_norm": 0.5460992907801419,
"acc_norm_stderr": 0.029700453247291474
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.48891786179921776,
"acc_stderr": 0.012767098998525843,
"acc_norm": 0.48891786179921776,
"acc_norm_stderr": 0.012767098998525843
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7591836734693878,
"acc_stderr": 0.02737294220178816,
"acc_norm": 0.7591836734693878,
"acc_norm_stderr": 0.02737294220178816
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5630354957160343,
"mc1_stderr": 0.017363844503195967,
"mc2": 0.6943518929245227,
"mc2_stderr": 0.0152631127664847
},
"harness|winogrande|5": {
"acc": 0.8737174427782163,
"acc_stderr": 0.009335559129908464
},
"harness|gsm8k|5": {
"acc": 0.6277482941622441,
"acc_stderr": 0.013315375362565038
}
}
```
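As a quick sanity check on the numbers above, the per-task `acc` values can be pulled out of the results dictionary and ranked. A minimal sketch (the dictionary below is a hand-copied subset of the JSON above for illustration, not fetched from the Hub):

```python
# Subset of the results JSON above, copied by hand for illustration only.
results = {
    "harness|arc:challenge|25": {"acc": 0.7568259385665529},
    "harness|hellaswag|10": {"acc": 0.8125871340370444},
    "harness|winogrande|5": {"acc": 0.8737174427782163},
    "harness|gsm8k|5": {"acc": 0.6277482941622441},
}

# Rank tasks from strongest to weakest accuracy.
ranked = sorted(results.items(), key=lambda kv: kv[1]["acc"], reverse=True)
for task, metrics in ranked:
    print(f"{task:30s} acc={metrics['acc']:.4f}")
```

The same pattern works on the full dictionary once loaded with `load_dataset` as shown earlier.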
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
goendalf666/sales-conversations-instruction-customer | ---
dataset_info:
features:
- name: '0'
dtype: string
splits:
- name: train
num_bytes: 21867656
num_examples: 20927
download_size: 3900514
dataset_size: 21867656
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "sales-conversations-instruction-customer"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_layoric__llama-2-13b-code-alpaca | ---
pretty_name: Evaluation run of layoric/llama-2-13b-code-alpaca
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [layoric/llama-2-13b-code-alpaca](https://huggingface.co/layoric/llama-2-13b-code-alpaca)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_layoric__llama-2-13b-code-alpaca\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T08:33:30.933109](https://huggingface.co/datasets/open-llm-leaderboard/details_layoric__llama-2-13b-code-alpaca/blob/main/results_2023-09-17T08-33-30.933109.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0018875838926174498,\n\
\ \"em_stderr\": 0.00044451099905589575,\n \"f1\": 0.06352139261744941,\n\
\ \"f1_stderr\": 0.001394404442569597,\n \"acc\": 0.4415195195231134,\n\
\ \"acc_stderr\": 0.010426765880718628\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0018875838926174498,\n \"em_stderr\": 0.00044451099905589575,\n\
\ \"f1\": 0.06352139261744941,\n \"f1_stderr\": 0.001394404442569597\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11902956785443518,\n \
\ \"acc_stderr\": 0.008919702911161632\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7640094711917916,\n \"acc_stderr\": 0.011933828850275625\n\
\ }\n}\n```"
repo_url: https://huggingface.co/layoric/llama-2-13b-code-alpaca
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|arc:challenge|25_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T08_33_30.933109
path:
- '**/details_harness|drop|3_2023-09-17T08-33-30.933109.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T08-33-30.933109.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T08_33_30.933109
path:
- '**/details_harness|gsm8k|5_2023-09-17T08-33-30.933109.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T08-33-30.933109.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hellaswag|10_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T14:43:19.893957.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T14:43:19.893957.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T14:43:19.893957.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T08_33_30.933109
path:
- '**/details_harness|winogrande|5_2023-09-17T08-33-30.933109.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T08-33-30.933109.parquet'
- config_name: results
data_files:
- split: 2023_07_24T14_43_19.893957
path:
- results_2023-07-24T14:43:19.893957.parquet
- split: 2023_09_17T08_33_30.933109
path:
- results_2023-09-17T08-33-30.933109.parquet
- split: latest
path:
- results_2023-09-17T08-33-30.933109.parquet
---
# Dataset Card for Evaluation run of layoric/llama-2-13b-code-alpaca
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/layoric/llama-2-13b-code-alpaca
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [layoric/llama-2-13b-code-alpaca](https://huggingface.co/layoric/llama-2-13b-code-alpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_layoric__llama-2-13b-code-alpaca",
"harness_winogrande_5",
              split="latest")
```
## Latest results
These are the [latest results from run 2023-09-17T08:33:30.933109](https://huggingface.co/datasets/open-llm-leaderboard/details_layoric__llama-2-13b-code-alpaca/blob/main/results_2023-09-17T08-33-30.933109.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.0018875838926174498,
"em_stderr": 0.00044451099905589575,
"f1": 0.06352139261744941,
"f1_stderr": 0.001394404442569597,
"acc": 0.4415195195231134,
"acc_stderr": 0.010426765880718628
},
"harness|drop|3": {
"em": 0.0018875838926174498,
"em_stderr": 0.00044451099905589575,
"f1": 0.06352139261744941,
"f1_stderr": 0.001394404442569597
},
"harness|gsm8k|5": {
"acc": 0.11902956785443518,
"acc_stderr": 0.008919702911161632
},
"harness|winogrande|5": {
"acc": 0.7640094711917916,
"acc_stderr": 0.011933828850275625
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
lucadiliello/asnq | ---
dataset_info:
features:
- name: label
dtype: int64
- name: question
dtype: string
- name: answer
dtype: string
- name: key
dtype: int64
splits:
- name: test
num_bytes: 87612019
num_examples: 466148
- name: dev
num_bytes: 87607015
num_examples: 463914
- name: train
num_bytes: 3814936393
num_examples: 20377568
download_size: 2602671423
dataset_size: 3990155427
---
# Dataset Card for "asnq"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ignacioct/instruction_examples | ---
dataset_info:
- config_name: default
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 8969
num_examples: 10
download_size: 11445
dataset_size: 8969
- config_name: main
default:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 8969
num_examples: 10
download_size: 11445
dataset_size: 8969
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- config_name: main
data_files:
- split: train
path: main/train-*
---
|
smilerip/smileip | ---
license: other
---
|
UMAIR59/datasetllama | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
- name: data_source
dtype: string
splits:
- name: train
num_bytes: 31386060
num_examples: 24895
download_size: 15599439
dataset_size: 31386060
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
result-muse256-muse512-wuerst-sdv15/3b801040 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 189
num_examples: 10
download_size: 1374
dataset_size: 189
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "3b801040"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1713027427 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 10591
num_examples: 23
download_size: 9021
dataset_size: 10591
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713027427"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gabrielmbmb/prompts-collective-source | ---
language:
- en
dataset_info:
features:
- name: source
dtype: string
- name: kind
dtype: string
- name: evolved_from
dtype: string
- name: prompt
dtype: string
- name: embedding
sequence: float32
- name: distance_to_nn
dtype: float64
- name: nn_idx
dtype: int64
splits:
- name: train
num_bytes: 282891205
num_examples: 75983
download_size: 308596298
dataset_size: 282891205
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
adamo1139/basic_economics_questions_ts_test_4 | ---
license: apache-2.0
---
|
0-hero/OIG-small-chip2 | ---
dataset_info:
features:
- name: user
dtype: string
- name: chip2
dtype: string
splits:
- name: train
num_bytes: 82154419
num_examples: 210289
download_size: 51736759
dataset_size: 82154419
task_categories:
- conversational
- text2text-generation
language:
- en
---
# Dataset Card for "OIG-small-chip2"
OIG-small-chip2 dataset from https://laion.ai/blog/oig-dataset/ <br>
Original Dataset - https://github.com/LAION-AI/Open-Instruction-Generalist |
teaguitos/bocade09 | ---
license: openrail
---
|
autoevaluate/autoeval-eval-tweet_eval-emotion-cb9f8a-66323145584 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- tweet_eval
eval_info:
task: multi_class_classification
model: ShoneRan/bert-emotion
metrics: []
dataset_name: tweet_eval
dataset_config: emotion
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: ShoneRan/bert-emotion
* Dataset: tweet_eval
* Config: emotion
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Lenague](https://huggingface.co/Lenague) for evaluating this model. |
NLP-ED/EduNER | ---
license: cc-by-4.0
---
# Educational named entity recognition dataset
1. EduNER is a Chinese named entity recognition dataset for education research.
2. More details about this dataset can be found at https://github.com/anonymous-xl/eduner or in our paper.
### Reference
Li, X., Wei, C., Jiang, Z. et al. EduNER: a Chinese named entity recognition dataset for education research. Neural Comput & Applic (2023). https://doi.org/10.1007/s00521-023-08635-5 |
CyberHarem/ayane_bluearchive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ayane/奥空アヤネ/绫音 (Blue Archive)
This is the dataset of ayane/奥空アヤネ/绫音 (Blue Archive), containing 238 images and their tags.
The core tags of this character are `black_hair, pointy_ears, short_hair, glasses, halo, red-framed_eyewear, hair_ornament, yellow_eyes, braid, breasts, red_halo, hair_flower`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 238 | 293.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ayane_bluearchive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 238 | 254.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ayane_bluearchive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 542 | 504.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ayane_bluearchive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ayane_bluearchive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 15 |  |  |  |  |  | 1girl, collared_shirt, open_jacket, school_uniform, solo, sweater_vest, upper_body, white_shirt, blazer, blue_necktie, looking_at_viewer, armband, flower, simple_background, blush, id_card, white_background, crown_braid, long_sleeves, open_mouth, brown_eyes, closed_mouth |
| 1 | 7 |  |  |  |  |  | 1girl, armband, black_skirt, blazer, blue_necktie, collared_shirt, long_sleeves, open_jacket, plaid_skirt, pleated_skirt, school_uniform, solo, sweater_vest, white_shirt, id_card, looking_at_viewer, simple_background, white_background, closed_mouth, cowboy_shot, holding, smile, blush, flower |
| 2 | 7 |  |  |  |  |  | 1girl, black_skirt, black_socks, blazer, blue_necktie, collared_shirt, long_sleeves, looking_at_viewer, open_jacket, plaid_skirt, pleated_skirt, school_uniform, solo, white_shirt, full_body, kneehighs, simple_background, smile, armband, closed_mouth, id_card, loafers, standing, white_background, black_footwear, blush, flower, holding_tablet_pc, yellow_sweater_vest |
| 3 | 48 |  |  |  |  |  | official_alternate_costume, striped_bikini, striped_clothes, 1girl, navel, solo, collarbone, looking_at_viewer, stomach, cleavage, white_jacket, blush, medium_breasts, open_jacket, short_shorts, long_sleeves, blue_bikini, denim_shorts, flower, open_mouth, smile, cowboy_shot, outdoors, bikini_top_only, front-tie_bikini_top, crown_braid |
| 4 | 9 |  |  |  |  |  | 1girl, simple_background, sleeveless_dress, white_background, solo, blush, closed_mouth, looking_at_viewer, white_dress, brown_eyes, garrison_cap, white_gloves, alternate_costume, bare_shoulders, china_dress |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | collared_shirt | open_jacket | school_uniform | solo | sweater_vest | upper_body | white_shirt | blazer | blue_necktie | looking_at_viewer | armband | flower | simple_background | blush | id_card | white_background | crown_braid | long_sleeves | open_mouth | brown_eyes | closed_mouth | black_skirt | plaid_skirt | pleated_skirt | cowboy_shot | holding | smile | black_socks | full_body | kneehighs | loafers | standing | black_footwear | holding_tablet_pc | yellow_sweater_vest | official_alternate_costume | striped_bikini | striped_clothes | navel | collarbone | stomach | cleavage | white_jacket | medium_breasts | short_shorts | blue_bikini | denim_shorts | outdoors | bikini_top_only | front-tie_bikini_top | sleeveless_dress | white_dress | garrison_cap | white_gloves | alternate_costume | bare_shoulders | china_dress |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------------|:-----------------|:-------|:---------------|:-------------|:--------------|:---------|:---------------|:--------------------|:----------|:---------|:--------------------|:--------|:----------|:-------------------|:--------------|:---------------|:-------------|:-------------|:---------------|:--------------|:--------------|:----------------|:--------------|:----------|:--------|:--------------|:------------|:------------|:----------|:-----------|:-----------------|:--------------------|:----------------------|:-----------------------------|:-----------------|:------------------|:--------|:-------------|:----------|:-----------|:---------------|:-----------------|:---------------|:--------------|:---------------|:-----------|:------------------|:-----------------------|:-------------------|:--------------|:---------------|:---------------|:--------------------|:-----------------|:--------------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | X | X | | | X | X | X | X | X | X | X | X | X | X | | X | | | X | X | X | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 48 |  |  |  |  |  | X | | X | | X | | | | | | X | | X | | X | | | X | X | X | | | | | | X | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | | | | X | | | | | | X | | | X | X | | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X |
|
RIPL/TTIC-common | ---
license: cc-by-nc-4.0
---
|
autoevaluate/autoeval-eval-ccdv__arxiv-summarization-document-dcd037-2375274516 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- ccdv/arxiv-summarization
eval_info:
task: summarization
model: pszemraj/long-t5-tglobal-xl-16384-book-summary
metrics: ['bertscore', 'perplexity']
dataset_name: ccdv/arxiv-summarization
dataset_config: document
dataset_split: test
col_mapping:
text: article
target: abstract
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/long-t5-tglobal-xl-16384-book-summary
* Dataset: ccdv/arxiv-summarization
* Config: document
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model. |
huggingartists/arash | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/arash"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.154835 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/de78420433126e9e426443d10bf22edf.600x600x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/arash">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Arash</div>
<a href="https://genius.com/artists/arash">
<div style="text-align: center; font-size: 14px;">@arash</div>
</a>
</div>
### Dataset Summary
A lyrics dataset parsed from Genius, designed for generating lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/arash).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/arash")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|105| -| -|
The 'train' split can easily be divided into 'train', 'validation', and 'test' splits with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/arash")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
texts = datasets['train']['text']
train, validation, test = np.split(
    texts,
    [
        int(len(texts) * train_percentage),
        int(len(texts) * (train_percentage + validation_percentage)),
    ],
)
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
    author = {Aleksey Korshuk},
    year = {2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
shossain/merged-no-pad-32768 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 1226023649
num_examples: 3036
download_size: 337654761
dataset_size: 1226023649
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "merged-no-pad-32768"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
carnival13/rbrt_hard_curr_uda_ep3 | ---
dataset_info:
features:
- name: domain_label
dtype: int64
- name: pass_label
dtype: int64
- name: input
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 744404183
num_examples: 519240
download_size: 242101139
dataset_size: 744404183
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "rbrt_hard_curr_uda_ep3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Kadarxwoody/artistic-2.0 | ---
license: artistic-2.0
---
|
hayesyang/diff_sitemap_and_direct | ---
dataset_info:
features:
- name: url
dtype: string
- name: sitemap
dtype: string
- name: local
dtype: string
- name: quick_ratio
dtype: float64
splits:
- name: zh
num_bytes: 74903836
num_examples: 2771
- name: en
num_bytes: 69187224
num_examples: 2258
- name: fr
num_bytes: 38867616
num_examples: 1201
- name: es
num_bytes: 56906331
num_examples: 1695
- name: ru
num_bytes: 35285827
num_examples: 926
- name: ar
num_bytes: 34554954
num_examples: 883
download_size: 84893570
dataset_size: 309705788
---
# Dataset Card for "diff_sitemap_and_direct"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
peoples_daily_ner | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- zh
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- named-entity-recognition
pretty_name: People's Daily NER
dataset_info:
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
config_name: peoples_daily_ner
splits:
- name: train
num_bytes: 14972456
num_examples: 20865
- name: validation
num_bytes: 1676741
num_examples: 2319
- name: test
num_bytes: 3346975
num_examples: 4637
download_size: 8385672
dataset_size: 19996172
---
# Dataset Card for People's Daily NER
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Github](https://github.com/OYE93/Chinese-NLP-Corpus/tree/master/NER/People's%20Daily)
- **Repository:** [Github](https://github.com/OYE93/Chinese-NLP-Corpus/)
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
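While this section is still to be filled in, the feature schema in the card header already fixes the instance shape. The sketch below uses an invented sentence (not drawn from the corpus) to show how the integer `ner_tags` ids map back to their IOB2 names:

```python
# Tag names in the order declared by the card's class_label feature;
# the example tokens are illustrative, not taken from the corpus.
NER_TAG_NAMES = ['O', 'B-PER', 'I-PER', 'B-ORG', 'I-ORG', 'B-LOC', 'I-LOC']

example = {
    "id": "0",
    "tokens": ["我", "们", "在", "北", "京"],
    "ner_tags": [0, 0, 0, 5, 6],  # integer ids into NER_TAG_NAMES
}

def decode_tags(tag_ids):
    """Map integer `ner_tags` ids back to their IOB2 label strings."""
    return [NER_TAG_NAMES[i] for i in tag_ids]

print(decode_tags(example["ner_tags"]))  # ['O', 'O', 'O', 'B-LOC', 'I-LOC']
```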
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
No citation available for this dataset.
### Contributions
Thanks to [@JetRunner](https://github.com/JetRunner) for adding this dataset. |
fragom/full | ---
license: apache-2.0
---
|
terhdavid/proba_dataset-3 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: tokens
sequence: string
- name: ner
sequence:
class_label:
names:
'0': O
'1': B-ORG
'2': I-ORG
splits:
- name: train
num_bytes: 143190.77989130435
num_examples: 662
- name: test
num_bytes: 16006.220108695652
num_examples: 74
- name: validation
num_bytes: 16006.220108695652
num_examples: 74
download_size: 35415
dataset_size: 175203.22010869565
---
# Dataset Card for "proba_dataset-3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jlbaker361/league_faces | ---
dataset_info:
features:
- name: splash
dtype: image
- name: tile
dtype: image
- name: label
dtype: string
splits:
- name: train
num_bytes: 36848634.0
num_examples: 419
download_size: 36050904
dataset_size: 36848634.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yjernite/prof_report__CompVis-stable-diffusion-v1-4__multi__24 | ---
dataset_info:
features:
- name: cluster_id
dtype: int64
- name: cluster_size
dtype: int64
- name: img_ids
sequence: int64
- name: img_cluster_scores
sequence: float64
splits:
- name: accountant
num_bytes: 1768
num_examples: 7
- name: aerospace_engineer
num_bytes: 1840
num_examples: 10
- name: aide
num_bytes: 1792
num_examples: 8
- name: air_conditioning_installer
num_bytes: 1696
num_examples: 4
- name: architect
num_bytes: 1792
num_examples: 8
- name: artist
num_bytes: 1960
num_examples: 15
- name: author
num_bytes: 1792
num_examples: 8
- name: baker
num_bytes: 1656
num_examples: 9
- name: bartender
num_bytes: 1720
num_examples: 5
- name: bus_driver
num_bytes: 1912
num_examples: 13
- name: butcher
num_bytes: 1768
num_examples: 7
- name: career_counselor
num_bytes: 1768
num_examples: 7
- name: carpenter
num_bytes: 1744
num_examples: 6
- name: carpet_installer
num_bytes: 1720
num_examples: 5
- name: cashier
num_bytes: 1744
num_examples: 6
- name: ceo
num_bytes: 1680
num_examples: 10
- name: childcare_worker
num_bytes: 1816
num_examples: 9
- name: civil_engineer
num_bytes: 1720
num_examples: 5
- name: claims_appraiser
num_bytes: 1744
num_examples: 6
- name: cleaner
num_bytes: 1912
num_examples: 13
- name: clergy
num_bytes: 1792
num_examples: 8
- name: clerk
num_bytes: 1912
num_examples: 13
- name: coach
num_bytes: 1840
num_examples: 10
- name: community_manager
num_bytes: 1768
num_examples: 7
- name: compliance_officer
num_bytes: 1792
num_examples: 8
- name: computer_programmer
num_bytes: 1864
num_examples: 11
- name: computer_support_specialist
num_bytes: 1744
num_examples: 6
- name: computer_systems_analyst
num_bytes: 1888
num_examples: 12
- name: construction_worker
num_bytes: 1720
num_examples: 5
- name: cook
num_bytes: 1840
num_examples: 10
- name: correctional_officer
num_bytes: 1864
num_examples: 11
- name: courier
num_bytes: 1912
num_examples: 13
- name: credit_counselor
num_bytes: 1792
num_examples: 8
- name: customer_service_representative
num_bytes: 1792
num_examples: 8
- name: data_entry_keyer
num_bytes: 1768
num_examples: 7
- name: dental_assistant
num_bytes: 1720
num_examples: 5
- name: dental_hygienist
num_bytes: 1696
num_examples: 4
- name: dentist
num_bytes: 1840
num_examples: 10
- name: designer
num_bytes: 1888
num_examples: 12
- name: detective
num_bytes: 1792
num_examples: 8
- name: director
num_bytes: 1840
num_examples: 10
- name: dishwasher
num_bytes: 1864
num_examples: 11
- name: dispatcher
num_bytes: 1744
num_examples: 6
- name: doctor
num_bytes: 1816
num_examples: 9
- name: drywall_installer
num_bytes: 1672
num_examples: 3
- name: electrical_engineer
num_bytes: 1816
num_examples: 9
- name: electrician
num_bytes: 1720
num_examples: 5
- name: engineer
num_bytes: 1768
num_examples: 7
- name: event_planner
num_bytes: 1696
num_examples: 4
- name: executive_assistant
num_bytes: 1696
num_examples: 4
- name: facilities_manager
num_bytes: 1792
num_examples: 8
- name: farmer
num_bytes: 1648
num_examples: 2
- name: fast_food_worker
num_bytes: 1864
num_examples: 11
- name: file_clerk
num_bytes: 1864
num_examples: 11
- name: financial_advisor
num_bytes: 1720
num_examples: 5
- name: financial_analyst
num_bytes: 1792
num_examples: 8
- name: financial_manager
num_bytes: 1744
num_examples: 6
- name: firefighter
num_bytes: 1696
num_examples: 4
- name: fitness_instructor
num_bytes: 1720
num_examples: 5
- name: graphic_designer
num_bytes: 1840
num_examples: 10
- name: groundskeeper
num_bytes: 1744
num_examples: 6
- name: hairdresser
num_bytes: 1792
num_examples: 8
- name: head_cook
num_bytes: 1864
num_examples: 11
- name: health_technician
num_bytes: 1792
num_examples: 8
- name: industrial_engineer
num_bytes: 1768
num_examples: 7
- name: insurance_agent
num_bytes: 1816
num_examples: 9
- name: interior_designer
num_bytes: 1744
num_examples: 6
- name: interviewer
num_bytes: 1912
num_examples: 13
- name: inventory_clerk
num_bytes: 1864
num_examples: 11
- name: it_specialist
num_bytes: 1696
num_examples: 4
- name: jailer
num_bytes: 1816
num_examples: 9
- name: janitor
num_bytes: 1816
num_examples: 9
- name: laboratory_technician
num_bytes: 1888
num_examples: 12
- name: language_pathologist
num_bytes: 1816
num_examples: 9
- name: lawyer
num_bytes: 1768
num_examples: 7
- name: librarian
num_bytes: 1816
num_examples: 9
- name: logistician
num_bytes: 1864
num_examples: 11
- name: machinery_mechanic
num_bytes: 1744
num_examples: 6
- name: machinist
num_bytes: 1816
num_examples: 9
- name: maid
num_bytes: 1816
num_examples: 9
- name: manager
num_bytes: 1720
num_examples: 5
- name: manicurist
num_bytes: 1744
num_examples: 6
- name: market_research_analyst
num_bytes: 1816
num_examples: 9
- name: marketing_manager
num_bytes: 1744
num_examples: 6
- name: massage_therapist
num_bytes: 1744
num_examples: 6
- name: mechanic
num_bytes: 1720
num_examples: 5
- name: mechanical_engineer
num_bytes: 1792
num_examples: 8
- name: medical_records_specialist
num_bytes: 1792
num_examples: 8
- name: mental_health_counselor
num_bytes: 1816
num_examples: 9
- name: metal_worker
num_bytes: 1744
num_examples: 6
- name: mover
num_bytes: 1888
num_examples: 12
- name: musician
num_bytes: 1912
num_examples: 13
- name: network_administrator
num_bytes: 1624
num_examples: 1
- name: nurse
num_bytes: 1720
num_examples: 5
- name: nursing_assistant
num_bytes: 1696
num_examples: 4
- name: nutritionist
num_bytes: 1696
num_examples: 4
- name: occupational_therapist
num_bytes: 1744
num_examples: 6
- name: office_clerk
num_bytes: 1792
num_examples: 8
- name: office_worker
num_bytes: 1840
num_examples: 10
- name: painter
num_bytes: 1960
num_examples: 15
- name: paralegal
num_bytes: 1720
num_examples: 5
- name: payroll_clerk
num_bytes: 1768
num_examples: 7
- name: pharmacist
num_bytes: 1864
num_examples: 11
- name: pharmacy_technician
num_bytes: 1720
num_examples: 5
- name: photographer
num_bytes: 1864
num_examples: 11
- name: physical_therapist
num_bytes: 1792
num_examples: 8
- name: pilot
num_bytes: 1816
num_examples: 9
- name: plane_mechanic
num_bytes: 1744
num_examples: 6
- name: plumber
num_bytes: 1720
num_examples: 5
- name: police_officer
num_bytes: 1816
num_examples: 9
- name: postal_worker
num_bytes: 1816
num_examples: 9
- name: printing_press_operator
num_bytes: 1816
num_examples: 9
- name: producer
num_bytes: 1840
num_examples: 10
- name: psychologist
num_bytes: 1840
num_examples: 10
- name: public_relations_specialist
num_bytes: 1696
num_examples: 4
- name: purchasing_agent
num_bytes: 1864
num_examples: 11
- name: radiologic_technician
num_bytes: 1840
num_examples: 10
- name: real_estate_broker
num_bytes: 1744
num_examples: 6
- name: receptionist
num_bytes: 1672
num_examples: 3
- name: repair_worker
num_bytes: 1720
num_examples: 5
- name: roofer
num_bytes: 1696
num_examples: 4
- name: sales_manager
num_bytes: 1648
num_examples: 2
- name: salesperson
num_bytes: 1696
num_examples: 4
- name: school_bus_driver
num_bytes: 1960
num_examples: 15
- name: scientist
num_bytes: 1912
num_examples: 13
- name: security_guard
num_bytes: 1768
num_examples: 7
- name: sheet_metal_worker
num_bytes: 1720
num_examples: 5
- name: singer
num_bytes: 1984
num_examples: 16
- name: social_assistant
num_bytes: 1768
num_examples: 7
- name: social_worker
num_bytes: 1864
num_examples: 11
- name: software_developer
num_bytes: 1696
num_examples: 4
- name: stocker
num_bytes: 1864
num_examples: 11
- name: supervisor
num_bytes: 1768
num_examples: 7
- name: taxi_driver
num_bytes: 1792
num_examples: 8
- name: teacher
num_bytes: 1912
num_examples: 13
- name: teaching_assistant
num_bytes: 1864
num_examples: 11
- name: teller
num_bytes: 2008
num_examples: 17
- name: therapist
num_bytes: 1912
num_examples: 13
- name: tractor_operator
num_bytes: 1720
num_examples: 5
- name: truck_driver
num_bytes: 1696
num_examples: 4
- name: tutor
num_bytes: 1912
num_examples: 13
- name: underwriter
num_bytes: 1768
num_examples: 7
- name: veterinarian
num_bytes: 1744
num_examples: 6
- name: welder
num_bytes: 1696
num_examples: 4
- name: wholesale_buyer
num_bytes: 1816
num_examples: 9
- name: writer
num_bytes: 1816
num_examples: 9
download_size: 635630
dataset_size: 261408
---
# Dataset Card for "prof_report__CompVis-stable-diffusion-v1-4__multi__24"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-72000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 653179
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
weqweasdas/preference_dataset_mix2 | ---
dataset_info:
features:
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: chosen_score
dtype: float64
- name: rejected_score
dtype: float64
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 1797801737
num_examples: 528029
download_size: 1018650407
dataset_size: 1797801737
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "preference_dataset_mix2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-public_relations | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: fewshot_context_neg
dtype: string
splits:
- name: dev
num_bytes: 4686
num_examples: 5
- name: test
num_bytes: 427274
num_examples: 110
download_size: 78594
dataset_size: 431960
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
# Dataset Card for "mmlu-public_relations"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/kabe_tomoe_soundeuphonium | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Kabe Tomoe
This is the dataset of Kabe Tomoe, containing 52 images and their tags.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 52 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 118 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 52 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 52 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 52 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 52 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 52 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 118 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 118 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 118 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
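As a minimal sketch, any archive in the table above can be fetched and unpacked with the Python standard library. The `resolve/main` URL pattern below mirrors the direct download links these dataset repositories expose; the destination directory name is arbitrary.

```python
import os
import urllib.request
import zipfile

BASE_URL = "https://huggingface.co/datasets/CyberHarem/kabe_tomoe_soundeuphonium/resolve/main"

def bundle_url(filename: str) -> str:
    """Build the direct download URL for one of the archives listed above."""
    return f"{BASE_URL}/{filename}"

def fetch_and_extract(filename: str, dest_dir: str) -> str:
    """Download an archive and extract its contents into dest_dir."""
    zip_path, _ = urllib.request.urlretrieve(bundle_url(filename))
    os.makedirs(dest_dir, exist_ok=True)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(dest_dir)
    return dest_dir

# e.g. fetch_and_extract("dataset-512x512.zip", "kabe_tomoe_512x512")
```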
|
keremberke/protective-equipment-detection | ---
task_categories:
- object-detection
tags:
- roboflow
- roboflow2huggingface
- Manufacturing
---
<div align="center">
<img width="640" alt="keremberke/protective-equipment-detection" src="https://huggingface.co/datasets/keremberke/protective-equipment-detection/resolve/main/thumbnail.jpg">
</div>
### Dataset Labels
```
['glove', 'goggles', 'helmet', 'mask', 'no_glove', 'no_goggles', 'no_helmet', 'no_mask', 'no_shoes', 'shoes']
```
### Number of Images
```json
{'valid': 3570, 'test': 1935, 'train': 6473}
```
### How to Use
- Install [datasets](https://pypi.org/project/datasets/):
```bash
pip install datasets
```
- Load the dataset:
```python
from datasets import load_dataset
ds = load_dataset("keremberke/protective-equipment-detection", name="full")
example = ds['train'][0]
```
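The integer `category` ids in each example's annotations index into the label list above. A minimal sketch of mapping them back to names, assuming the usual roboflow2huggingface `objects` layout (parallel `category` and `bbox` lists; the toy annotation dict below is invented):

```python
# Label list from "Dataset Labels" above; order matters, since annotation
# category ids index into it.
LABELS = ['glove', 'goggles', 'helmet', 'mask', 'no_glove', 'no_goggles',
          'no_helmet', 'no_mask', 'no_shoes', 'shoes']

def named_annotations(objects):
    """Pair each bounding box with its label name.

    Assumes `objects` holds parallel `category` (int ids) and
    `bbox` ([x, y, w, h]) lists, as in roboflow2huggingface exports.
    """
    return [(LABELS[cat], bbox)
            for cat, bbox in zip(objects["category"], objects["bbox"])]

# Toy annotation dict shaped like example['objects']:
print(named_annotations({"category": [2, 9],
                         "bbox": [[10, 10, 40, 40], [5, 80, 30, 20]]}))
# [('helmet', [10, 10, 40, 40]), ('shoes', [5, 80, 30, 20])]
```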
### Roboflow Dataset Page
[https://universe.roboflow.com/personal-protective-equipment/ppes-kaxsi/dataset/7](https://universe.roboflow.com/personal-protective-equipment/ppes-kaxsi/dataset/7?ref=roboflow2huggingface)
### Citation
```
@misc{ ppes-kaxsi_dataset,
title = { PPEs Dataset },
type = { Open Source Dataset },
author = { Personal Protective Equipment },
howpublished = { \\url{ https://universe.roboflow.com/personal-protective-equipment/ppes-kaxsi } },
url = { https://universe.roboflow.com/personal-protective-equipment/ppes-kaxsi },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2022 },
month = { jul },
note = { visited on 2023-01-18 },
}
```
### License
CC BY 4.0
### Dataset Summary
This dataset was exported via roboflow.ai on July 7, 2022 at 3:49 PM GMT
It includes 11978 images.
PPE equipment is annotated in COCO format.
The following pre-processing was applied to each image:
* Auto-orientation of pixel data (with EXIF-orientation stripping)
No image augmentation techniques were applied.
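Since the annotations are COCO-style `[x, y, width, height]` boxes, a small helper (an illustrative sketch, not part of the export) for converting to corner coordinates can be handy when plotting or computing IoU:

```python
def coco_to_corners(bbox):
    """Convert a COCO [x, y, w, h] box to [x1, y1, x2, y2] corner format."""
    x, y, w, h = bbox
    return [x, y, x + w, y + h]

print(coco_to_corners([10, 20, 50, 60]))  # [10, 20, 60, 80]
```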
|
autoevaluate/autoeval-staging-eval-project-9a279865-5267-44c3-8be5-f8885af614f3-1715 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- glue
eval_info:
task: binary_classification
model: autoevaluate/binary-classification
metrics: ['matthews_correlation']
dataset_name: glue
dataset_config: sst2
dataset_split: validation
col_mapping:
text: sentence
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Binary Text Classification
* Model: autoevaluate/binary-classification
* Dataset: glue
* Config: sst2
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
nikesh66/Hatespeech-Dataset | ---
language:
- en
---
# Hate Speech Dataset
This dataset contains artificially generated tweets, each labeled as hate speech or not.
## Dataset Description
- Number of Rows: 5,000
- Number of Columns: 2
- Column Names: 'Tweet', 'Hate Speech'
- Description: Each row contains a tweet and a binary label ('yes' or 'no') indicating whether the tweet contains hate speech. |
distilled-from-one-sec-cv12/chunk_270 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 822993180
num_examples: 160365
download_size: 837706216
dataset_size: 822993180
---
# Dataset Card for "chunk_270"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_wnli_drop_inf_to | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 5278
num_examples: 26
- name: test
num_bytes: 10225
num_examples: 38
- name: train
num_bytes: 39128
num_examples: 191
download_size: 27824
dataset_size: 54631
---
# Dataset Card for "MULTI_VALUE_wnli_drop_inf_to"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_jsfs11__MoEv4Config-TestWeightedTIES-7b | ---
pretty_name: Evaluation run of jsfs11/MoEv4Config-TestWeightedTIES-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jsfs11/MoEv4Config-TestWeightedTIES-7b](https://huggingface.co/jsfs11/MoEv4Config-TestWeightedTIES-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jsfs11__MoEv4Config-TestWeightedTIES-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-13T02:02:25.718640](https://huggingface.co/datasets/open-llm-leaderboard/details_jsfs11__MoEv4Config-TestWeightedTIES-7b/blob/main/results_2024-02-13T02-02-25.718640.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6563638437086499,\n\
\ \"acc_stderr\": 0.032002661093451415,\n \"acc_norm\": 0.6557415111591477,\n\
\ \"acc_norm_stderr\": 0.032674509311910384,\n \"mc1\": 0.5422276621787026,\n\
\ \"mc1_stderr\": 0.017440965712482125,\n \"mc2\": 0.708709362408976,\n\
\ \"mc2_stderr\": 0.014616149007167033\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6860068259385665,\n \"acc_stderr\": 0.013562691224726295,\n\
\ \"acc_norm\": 0.7158703071672355,\n \"acc_norm_stderr\": 0.013179442447653884\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6951802429794861,\n\
\ \"acc_stderr\": 0.004593902601979337,\n \"acc_norm\": 0.8818960366460864,\n\
\ \"acc_norm_stderr\": 0.0032207161266850255\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4312169312169312,\n \"acc_stderr\": 0.02550648169813821,\n \"\
acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.02550648169813821\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n\
\ \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n\
\ \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.034086558679777494,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.034086558679777494\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290913,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290913\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.035477710041594626,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.035477710041594626\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179326,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179326\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.013740797258579825,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.013740797258579825\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42681564245810055,\n\
\ \"acc_stderr\": 0.016542401954631917,\n \"acc_norm\": 0.42681564245810055,\n\
\ \"acc_norm_stderr\": 0.016542401954631917\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.024748624490537365,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.024748624490537365\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n\
\ \"acc_stderr\": 0.012744149704869649,\n \"acc_norm\": 0.4680573663624511,\n\
\ \"acc_norm_stderr\": 0.012744149704869649\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \
\ \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070806,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070806\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5422276621787026,\n\
\ \"mc1_stderr\": 0.017440965712482125,\n \"mc2\": 0.708709362408976,\n\
\ \"mc2_stderr\": 0.014616149007167033\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8382004735595896,\n \"acc_stderr\": 0.010350128010292406\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7278241091736164,\n \
\ \"acc_stderr\": 0.01225971403516455\n }\n}\n```"
repo_url: https://huggingface.co/jsfs11/MoEv4Config-TestWeightedTIES-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|arc:challenge|25_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|gsm8k|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hellaswag|10_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T02-02-25.718640.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-13T02-02-25.718640.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- '**/details_harness|winogrande|5_2024-02-13T02-02-25.718640.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-13T02-02-25.718640.parquet'
- config_name: results
data_files:
- split: 2024_02_13T02_02_25.718640
path:
- results_2024-02-13T02-02-25.718640.parquet
- split: latest
path:
- results_2024-02-13T02-02-25.718640.parquet
---
# Dataset Card for Evaluation run of jsfs11/MoEv4Config-TestWeightedTIES-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jsfs11/MoEv4Config-TestWeightedTIES-7b](https://huggingface.co/jsfs11/MoEv4Config-TestWeightedTIES-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jsfs11__MoEv4Config-TestWeightedTIES-7b",
"harness_winogrande_5",
         split="latest")
```
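Because the timestamped split names follow the pattern `YYYY_MM_DDTHH_MM_SS.ffffff`, they sort lexicographically in chronological order, so the most recent run can also be found without relying on the `latest` alias. A minimal sketch (the first split name below is illustrative):

```python
# Identify the most recent timestamped split without using the "latest" alias.
# Names of the form YYYY_MM_DDTHH_MM_SS.ffffff sort lexicographically in
# chronological order, so the lexicographic max is the chronological max.
split_names = [
    "2024_01_30T11_15_02.123456",  # hypothetical earlier run
    "2024_02_13T02_02_25.718640",
    "latest",
]

timestamped = [name for name in split_names if name != "latest"]
most_recent = max(timestamped)
print(most_recent)
```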
## Latest results
These are the [latest results from run 2024-02-13T02:02:25.718640](https://huggingface.co/datasets/open-llm-leaderboard/details_jsfs11__MoEv4Config-TestWeightedTIES-7b/blob/main/results_2024-02-13T02-02-25.718640.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the task-specific configuration and its "latest" split):
```python
{
"all": {
"acc": 0.6563638437086499,
"acc_stderr": 0.032002661093451415,
"acc_norm": 0.6557415111591477,
"acc_norm_stderr": 0.032674509311910384,
"mc1": 0.5422276621787026,
"mc1_stderr": 0.017440965712482125,
"mc2": 0.708709362408976,
"mc2_stderr": 0.014616149007167033
},
"harness|arc:challenge|25": {
"acc": 0.6860068259385665,
"acc_stderr": 0.013562691224726295,
"acc_norm": 0.7158703071672355,
"acc_norm_stderr": 0.013179442447653884
},
"harness|hellaswag|10": {
"acc": 0.6951802429794861,
"acc_stderr": 0.004593902601979337,
"acc_norm": 0.8818960366460864,
"acc_norm_stderr": 0.0032207161266850255
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.02550648169813821,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.02550648169813821
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.034086558679777494,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.034086558679777494
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290913,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290913
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.035477710041594626,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.035477710041594626
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179326,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179326
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579825,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579825
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42681564245810055,
"acc_stderr": 0.016542401954631917,
"acc_norm": 0.42681564245810055,
"acc_norm_stderr": 0.016542401954631917
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.024748624490537365,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.024748624490537365
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869649,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869649
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462927,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462927
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070806,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5422276621787026,
"mc1_stderr": 0.017440965712482125,
"mc2": 0.708709362408976,
"mc2_stderr": 0.014616149007167033
},
"harness|winogrande|5": {
"acc": 0.8382004735595896,
"acc_stderr": 0.010350128010292406
},
"harness|gsm8k|5": {
"acc": 0.7278241091736164,
"acc_stderr": 0.01225971403516455
}
}
```
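Once the results JSON is loaded, per-task scores can be aggregated with plain Python. The sketch below uses a hand-copied subset of the entries shown above:

```python
# Average normalized accuracy over a subset of the per-task results above.
# The keys and values are copied from the results JSON shown in this card.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.7158703071672355},
    "harness|hellaswag|10": {"acc_norm": 0.8818960366460864},
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.36},
}

avg_acc_norm = sum(task["acc_norm"] for task in results.values()) / len(results)
print(round(avg_acc_norm, 4))
```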
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
wise-east/spolin | ---
annotations_creators:
- crowdsourced
language_creators:
- expert-generated
- other
language:
- en
license:
- cc-by-nc-4.0
multilinguality:
- monolingual
pretty_name: spolin
size_categories:
- 100K<n<1M
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-classification
- text-generation
task_ids:
- text-scoring
- dialogue-modeling
---
# SPOLIN
[![CC BY-NC 4.0][cc-by-nc-shield]][cc-by-nc]
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Available SPOLIN Versions](#available_spolin_versions)
- [Relevant Links](#relevant-links)
- [Dataset Structure](#dataset-structure)
- [Dataset Statistics](#dataset-statistics)
- [Other Information](#other-information)
- [ACL Presentation](#acl-presentation)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
### Dataset Summary
This is the repo for the paper ["Grounding Conversations with Improvised Dialogues"](https://aclanthology.org/2020.acl-main.218/) (ACL2020).
The _Selected Pairs of Learnable ImprovisatioN_ (SPOLIN) corpus is a collection of more than 68,000 "Yes, and" type dialogue pairs extracted from the Spontaneanation podcast by Paul F. Tompkins, the Cornell Movie-Dialogs Corpus, and the SubTle corpus. For more information, refer to our [paper](https://arxiv.org/abs/2004.09544) or our [project page](https://justin-cho.com/spolin).
### Available SPOLIN Versions:
The core dataset that was used for the experiments in the paper only includes _yes-ands_ and non-_yes-ands_ from Spontaneanation and most of those extracted from the Cornell Movie-Dialogs Corpus. After submitting the paper, we continued our iterative data augmentation process, repeating another iteration with the Cornell Movie-Dialogs Corpus and extracting from the SubTle corpus. This expanded version is also included in this repository [here](/data). This latest version of SPOLIN was used to train the model used in our [demo](https://spolin.isi.edu).
In the `data` folder, we provide two versions of the SPOLIN training set:
1. Version used for experiments in the ACL paper: `data/spolin-train-acl.csv`
2. Expanded version: `data/spolin-train.csv`
### Relevant Links:
* Project page: https://justin-cho.com/spolin
* Github repo: https://github.com/wise-east/spolin
* SpolinBot Demo: https://spolin.isi.edu
* ACL2020 Paper: https://aclanthology.org/2020.acl-main.218/
## Dataset Structure
**Fields**
* `id`: unique identifier
* `prompt`: first utterance in utterance pair
* `response`: second utterance in utterance pair
* `label`: yesand = 1, non-yesand = 0
* `source`: the source for the sample
* `split`: whether the sample belongs to the training set or the validation set
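As a quick, self-contained illustration of this schema, the standard library is enough to filter for training-set _yes-ands_. The sample rows below are made up for demonstration; real rows come from `data/spolin-train.csv`:

```python
import csv
import io

# A tiny made-up sample in the SPOLIN schema (real data lives in data/spolin-train.csv).
sample_csv = """id,prompt,response,label,source,split
1,We should open a bakery.,"Yes, and we'll only sell croissants shaped like swans.",1,spont,train
2,We should open a bakery.,I don't like bread.,0,spont,train
3,Let's sail to the moon.,"Yes, and I'll pack the anti-gravity snacks.",1,cornell,validation
"""

rows = list(csv.DictReader(io.StringIO(sample_csv)))

# Keep only "yes-and" pairs (label == 1) from the training split.
train_yesands = [r for r in rows if r["label"] == "1" and r["split"] == "train"]

for r in train_yesands:
    print(r["prompt"], "->", r["response"])
```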
## Dataset Statistics
##### `spolin-train.csv`:
|| yesands| non-yesands|
|--|---:|---:|
|Spontaneanation|10,459|5,587*|
|Cornell|16,426|18,310|
|SubTle|40,303|19,512|
|Total|67,188|43,409|
##### `spolin-train-acl.csv`:
|| yesands| non-yesands|
|--|---:|---:|
|Spontaneanation|10,459|5,587*|
|Cornell|14,976|17,851|
|Total|25,435|23,438|
##### `spolin-valid.csv`:
|| yesands| non-yesands|
|--|---:|---:|
|Spontaneanation|500|500*|
|Cornell|500|500|
|Total|1,000|1,000|
\*Artificially collected by mixing and matching positive Spontaneanation samples to balance the dataset for training the classifier
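The mixing-and-matching noted in the footnote above can be sketched as follows: each prompt is paired with a response drawn from a *different* positive pair, so the resulting pair is (almost certainly) not a "Yes, and". This is only an illustrative reconstruction of the balancing idea, not the authors' exact procedure:

```python
import random

# Positive (prompt, response) pairs; the real ones come from Spontaneanation.
yesands = [
    ("Let's build a treehouse.", "Yes, and it'll have a rope elevator."),
    ("I found a map to treasure.", "Yes, and X marks my backyard."),
    ("We're starting a band.", "Yes, and I call dibs on the kazoo."),
]

def make_artificial_negatives(pairs, seed=0):
    """Shuffle responses so no prompt keeps its own response (a derangement)."""
    rng = random.Random(seed)
    responses = [r for _, r in pairs]
    while True:
        rng.shuffle(responses)
        if all(r != orig for (_, orig), r in zip(pairs, responses)):
            break
    return [(p, r) for (p, _), r in zip(pairs, responses)]

negatives = make_artificial_negatives(yesands)
for prompt, response in negatives:
    print(prompt, "->", response)
```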
## Other Information
### ACL Presentation
[Video recording](https://slideslive.com/38928948/grounding-conversations-with-improvised-dialogues)
### Citation Information
If you use our data for your work, please cite our ACL2020 paper:
```
@inproceedings{cho2020spolin,
title={Grounding Conversations with Improvised Dialogues},
author={Cho, Hyundong and May, Jonathan},
booktitle ={Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics},
publisher = {Association for Computational Linguistics},
location = {Seattle, Washington, USA},
year={2020}
}
```
### Licensing Information
This work is licensed under a [Creative Commons Attribution-NonCommercial 4.0 International License][cc-by-nc].
[![CC BY-NC 4.0][cc-by-nc-image]][cc-by-nc]
[cc-by-nc]: http://creativecommons.org/licenses/by-nc/4.0/
[cc-by-nc-image]: https://licensebuttons.net/l/by-nc/4.0/88x31.png
[cc-by-nc-shield]: https://img.shields.io/badge/License-CC%20BY--NC%204.0-lightgrey.svg
|
CyberHarem/murata_himeko_honkai3 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of murata_himeko (Houkai 3rd)
This is the dataset of murata_himeko (Houkai 3rd), containing 500 images and their tags.
The core tags of this character are `red_hair, bangs, yellow_eyes, breasts, large_breasts, long_hair, mole, mole_on_breast`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 719.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/murata_himeko_honkai3/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 381.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/murata_himeko_honkai3/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1171 | 803.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/murata_himeko_honkai3/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 619.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/murata_himeko_honkai3/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1171 | 1.14 GiB | [Download](https://huggingface.co/datasets/CyberHarem/murata_himeko_honkai3/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/murata_himeko_honkai3',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 22 |  |  |  |  |  | 1girl, bare_shoulders, solo, wedding_dress, white_dress, bridal_veil, bride, red_rose, smile, cleavage, hair_flower, looking_at_viewer, white_gloves, closed_mouth, petals, holding, simple_background, elbow_gloves, sleeveless, white_background, white_thighhighs |
| 1 | 6 |  |  |  |  |  | 1girl, bare_shoulders, closed_mouth, looking_at_viewer, solo, cleavage, hair_ornament, smile, earrings, holding_sword, red_gloves |
| 2 | 10 |  |  |  |  |  | 1girl, solo, black_gloves, boots, black_shorts, cleavage, red_jacket, thighhighs, belt, closed_mouth, holding_sword, sleeves_rolled_up, looking_at_viewer, smile, fire, aiguillette, cropped_jacket, full_body, short_shorts |
| 3 | 5 |  |  |  |  |  | 1girl, cleavage, closed_mouth, looking_at_viewer, simple_background, smile, solo, white_background, black_gloves, forehead, red_jacket |
| 4 | 11 |  |  |  |  |  | 1girl, solo, cleavage, looking_at_viewer, smile, bare_shoulders, closed_mouth, lipstick, forehead, simple_background, white_background, hair_ornament, china_dress, red_dress |
| 5 | 16 |  |  |  |  |  | black_bikini, cleavage, looking_at_viewer, smile, 1girl, solo, closed_mouth, sleeves_rolled_up, white_shirt, black_choker, navel, one_eye_closed, simple_background, alcohol, see-through, side-tie_bikini_bottom, sitting |
| 6 | 17 |  |  |  |  |  | 1boy, hetero, penis, 1girl, open_mouth, blush, nipples, looking_at_viewer, dark-skinned_male, solo_focus, mosaic_censoring, navel, pussy, sweat, completely_nude, spread_legs, tongue_out, ass, cum, indoors, parted_bangs, sex_from_behind, vaginal |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | solo | wedding_dress | white_dress | bridal_veil | bride | red_rose | smile | cleavage | hair_flower | looking_at_viewer | white_gloves | closed_mouth | petals | holding | simple_background | elbow_gloves | sleeveless | white_background | white_thighhighs | hair_ornament | earrings | holding_sword | red_gloves | black_gloves | boots | black_shorts | red_jacket | thighhighs | belt | sleeves_rolled_up | fire | aiguillette | cropped_jacket | full_body | short_shorts | forehead | lipstick | china_dress | red_dress | black_bikini | white_shirt | black_choker | navel | one_eye_closed | alcohol | see-through | side-tie_bikini_bottom | sitting | 1boy | hetero | penis | open_mouth | blush | nipples | dark-skinned_male | solo_focus | mosaic_censoring | pussy | sweat | completely_nude | spread_legs | tongue_out | ass | cum | indoors | parted_bangs | sex_from_behind | vaginal |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-------|:----------------|:--------------|:--------------|:--------|:-----------|:--------|:-----------|:--------------|:--------------------|:---------------|:---------------|:---------|:----------|:--------------------|:---------------|:-------------|:-------------------|:-------------------|:----------------|:-----------|:----------------|:-------------|:---------------|:--------|:---------------|:-------------|:-------------|:-------|:--------------------|:-------|:--------------|:-----------------|:------------|:---------------|:-----------|:-----------|:--------------|:------------|:---------------|:--------------|:---------------|:--------|:-----------------|:----------|:--------------|:-------------------------|:----------|:-------|:---------|:--------|:-------------|:--------|:----------|:--------------------|:-------------|:-------------------|:--------|:--------|:------------------|:--------------|:-------------|:------|:------|:----------|:---------------|:------------------|:----------|
| 0 | 22 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | | | | | | X | X | | X | | X | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 10 |  |  |  |  |  | X | | X | | | | | | X | X | | X | | X | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | X | | | | | | X | X | | X | | X | | | X | | | X | | | | | | X | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 11 |  |  |  |  |  | X | X | X | | | | | | X | X | | X | | X | | | X | | | X | | X | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 16 |  |  |  |  |  | X | | X | | | | | | X | X | | X | | X | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 6 | 17 |  |  |  |  |  | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
aidiary/github-issues | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: labels
list:
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: id
dtype: int64
- name: name
dtype: string
- name: node_id
dtype: string
- name: url
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: assignees
list:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: milestone
struct:
- name: closed_at
dtype: string
- name: closed_issues
dtype: int64
- name: created_at
dtype: string
- name: creator
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: description
dtype: string
- name: due_on
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: labels_url
dtype: string
- name: node_id
dtype: string
- name: number
dtype: int64
- name: open_issues
dtype: int64
- name: state
dtype: string
- name: title
dtype: string
- name: updated_at
dtype: string
- name: url
dtype: string
- name: comments
dtype: int64
- name: created_at
dtype: timestamp[ns, tz=UTC]
- name: updated_at
dtype: timestamp[ns, tz=UTC]
- name: closed_at
dtype: timestamp[ns, tz=UTC]
- name: author_association
dtype: string
- name: active_lock_reason
dtype: float64
- name: draft
dtype: float64
- name: pull_request
struct:
- name: diff_url
dtype: string
- name: html_url
dtype: string
- name: merged_at
dtype: string
- name: patch_url
dtype: string
- name: url
dtype: string
- name: body
dtype: string
- name: reactions
struct:
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: confused
dtype: int64
- name: eyes
dtype: int64
- name: heart
dtype: int64
- name: hooray
dtype: int64
- name: laugh
dtype: int64
- name: rocket
dtype: int64
- name: total_count
dtype: int64
- name: url
dtype: string
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: float64
- name: state_reason
dtype: string
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 20277330
num_examples: 6617
download_size: 4947918
dataset_size: 20277330
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
christti/squad-augmented-v2 | ---
pretty_name: SQuAD Augmented v2
license: cc-by-4.0
task_categories:
- question-answering
source_datasets:
- extended|wikipedia
task_ids:
- extractive-qa
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
- found
paperswithcode_id: squad
language:
- en
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
viewer: true
train-eval-index:
- config: plain_text
task: question-answering
task_id: extractive_question_answering
splits:
train_split: train
eval_split: validation
col_mapping:
question: question
context: context
answers:
text: text
answer_start: answer_start
metrics:
- type: squad
name: SQuAD
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
config_name: plain_text
splits:
- name: train
num_bytes: 156093315
num_examples: 169211
- name: validation
num_bytes: 10472653
num_examples: 10570
download_size: 35142551
dataset_size: 89789763
--- |
huggingartists/macan | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/macan"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.098787 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/9c2f93bf9d29964df4d9d5f41089a2b5.976x976x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/macan">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">MACAN</div>
<a href="https://genius.com/artists/macan">
<div style="text-align: center; font-size: 14px;">@macan</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/macan).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/macan")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|27| -| -|
The 'train' split can easily be divided into 'train', 'validation' & 'test' splits with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/macan")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
train, validation, test = np.split(datasets['train']['text'], [int(len(datasets['train']['text'])*train_percentage), int(len(datasets['train']['text'])*(train_percentage + validation_percentage))])
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
author={Aleksey Korshuk}
year=2021
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
asimsultan/cyber2k | ---
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 80124909
num_examples: 65232
download_size: 18992332
dataset_size: 80124909
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
joey234/mmlu-sociology-original-neg | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 6919.6567164179105
num_examples: 21
download_size: 7565
dataset_size: 6919.6567164179105
---
# Dataset Card for "mmlu-sociology-original-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lhoestq/digiface1m_720k | ---
license: other
---
|
jhu-clsp/seamless-align | ---
license: mit
task_categories:
- translation
- audio-to-audio
language:
- mt
- en
- cy
- te
- kn
- be
- ta
- uz
- tg
- ca
- ur
- zh
- th
- ko
- hi
- da
- cs
- vi
- sw
- rn
- uk
- tr
- ar
- id
- fi
- sk
- sv
- pl
- it
- pt
- ru
- de
- nl
- fr
---
# Dataset Card for Seamless-Align (WIP). Inspired by https://huggingface.co/datasets/allenai/nllb
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Needs More Information]
- **Repository:** [Needs More Information]
- **Paper:** [Needs More Information]
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** [Needs More Information]
### Dataset Summary
This dataset was created based on [metadata](https://github.com/facebookresearch/seamless_communication/blob/main/docs/m4t/seamless_align_README.md) for mined Speech-to-Speech (S2S), Text-to-Speech (TTS) and Speech-to-Text (S2T) data released by Meta AI. The S2S portion contains data for 35 language pairs and is ~1000GB compressed.
#### How to use the data
There are two ways to access the data:
* Via the Hugging Face Python datasets library
```
Scripts coming soon
```
* Clone the git repo
```
git lfs install
git clone https://huggingface.co/datasets/jhu-clsp/seamless-align
```
### Supported Tasks and Leaderboards
N/A
### Languages
Language pairs can be found [here](https://github.com/facebookresearch/seamless_communication/blob/main/docs/m4t/seamless_align_README.md).
## Dataset Structure
The S2S dataset contains two gzipped files, `src.tar.gz` and `tgt.tar.gz`.
### Data Instances
The number of instances for each language pair can be found in the [dataset_infos.json](https://huggingface.co/datasets/allenai/nllb/blob/main/dataset_infos.json) file.
### Data Fields
Data Field can be found [here](https://github.com/facebookresearch/seamless_communication/blob/main/docs/m4t/seamless_align_README.md).
### Data Splits
The data is not split.
## Dataset Creation
### Curation Rationale
### Source Data
Inspect links in metadata
#### Who are the source language producers?
Speech and text were collected from the web, much of it from web crawls.
### Annotations
#### Annotation process
Parallel sentences were identified using SONAR encoders (Duquenne et al., 2023).
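As a rough illustration of how embedding-based mining works, sentences from both languages are embedded into a shared space, and pairs whose similarity clears a threshold are kept as candidate translations. The sketch below is a toy cosine-similarity example with made-up 3-d vectors, not the SONAR pipeline itself:

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Toy 3-d "embeddings"; a real system would use SONAR sentence encoders.
src_embeddings = {"Hello world.": [0.9, 0.1, 0.0], "Good morning.": [0.1, 0.9, 0.2]}
tgt_embeddings = {"Bonjour le monde.": [0.88, 0.12, 0.05], "Il pleut.": [0.0, 0.2, 0.95]}

THRESHOLD = 0.9  # mining threshold; this value is chosen for the toy example

mined = [
    (s, t)
    for s, su in src_embeddings.items()
    for t, tu in tgt_embeddings.items()
    if cosine(su, tu) >= THRESHOLD
]
print(mined)
```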
#### Who are the annotators?
The data was not human annotated.
### Personal and Sensitive Information
Data may contain personally identifiable information, sensitive content, or toxic content that was publicly shared on the Internet.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset provides data for training machine learning systems for many languages.
### Discussion of Biases
Biases in the data have not been specifically studied, however as the original source of data is World Wide Web it is likely that the data has biases similar to those prevalent in the Internet. The data may also exhibit biases introduced by language identification and data filtering techniques; lower resource languages generally have lower accuracy.
### Other Known Limitations
Some of the translations are in fact machine translations. While some website machine translation tools are identifiable from the HTML source, these tools were not filtered out en masse because raw HTML was not available from some sources and CommonCrawl processing started from WET files.
## Additional Information
### Dataset Curators
The data was not curated.
### Licensing Information
The dataset is released under the terms of [MIT](https://opensource.org/license/mit/). **PLEASE, USE DATA RESPONSIBLY**
### Citation Information
Seamless Communication et al, SeamlessM4T: Massively Multilingual & Multimodal Machine Translation. arXiv https://arxiv.org/abs/2308.11596, 2023. <br>
Duquenne et al, SONAR: Sentence-Level Multimodal and Language-Agnostic Representations. arXiv https://arxiv.org/abs/2308.11466, 2023
### Contributions
We thank the Seamless Communication Meta AI team for open sourcing the meta data and instructions on how to use it with special thanks to Loïc Barrault, Yu-An Chung, Mariano Cora Meglioli, David Dale, Ning Dong, Paul-Ambroise Duquenne, Hady Elsahar, Hongyu Gong, Kevin Heffernan, John Hoffman, Christopher Klaiber, Pengwei Li, Daniel Licht, Jean Maillard, Alice Rakotoarison, Kaushik Ram Sadagopan, Guillaume Wenzek, Ethan Ye, Bapi Akula, Peng-Jen Chen, Naji El Hachem, Brian Ellis, Gabriel Mejia Gonzalez, Justin Haaheim, Prangthip Hansanti, Russ Howes, Bernie Huang, Min-Jae Hwang, Hirofumi Inaguma, Somya Jain, Elahe Kalbassi, Amanda Kallet, Ilia Kulikov, Janice Lam, Daniel Li, Xutai Ma, Ruslan Mavlyutov, Benjamin Peloquin, Mohamed Ramadan, Abinesh Ramakrishnan, Anna Sun, Kevin Tran, Tuan Tran, Igor Tufanov, Vish Vogeti, Carleigh Wood, Yilin Yang, Bokai Yu, Pierre Andrews, Can Balioglu, Marta R. Costa-jussà, Onur Celebi, Maha Elbayad, Cynthia Gao, Francisco Guzmán, Justine Kao, Ann Lee, Alexandre Mourachko, Juan Pino, Sravya Popuri, Christophe Ropers, Safiyyah Saleem, Holger Schwenk, Paden Tomasello, Changhan Wang, Jeff Wang, Skyler Wang. We also thank the Center for Language and Speech Processing(CLSP) for hosting and releasing this data, including Bismarck Bamfo Odoom and Philipp Koehn (for engineering efforts to host the data, and releasing the huggingface dataset), and Alexandre Mourachko (for organizing the connection). |
Dimi446/siivagunner-style | ---
license: apache-2.0
---
|
moonmoon-Flytomoon/LSD | ---
license: other
license_name: other
license_link: LICENSE
---
|
manojpatil/123 | ---
dataset_info:
features:
- name: r
dtype: int64
- name: theta
dtype: string
splits:
- name: train
num_bytes: 173
num_examples: 7
download_size: 1415
dataset_size: 173
---
# Dataset Card for "123"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Arziva/biorxiv | ---
license: mit
---
|
imomayiz/darija-english | ---
language:
- ar
- en
license: cc
task_categories:
- translation
configs:
- config_name: sentences
data_files:
- split: sentences
path: sentences.csv
sep: ","
- config_name: submissions
data_files:
- split: submissions
path: submissions/submissions*.json
---
This work is part of [DODa](https://darija-open-dataset.github.io/).
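Since the `sentences` config above is declared as a comma-separated CSV (`sep: ","`), it can be parsed with the standard library. Below is a minimal sketch on an inline sample; the column names `darija` and `eng` are assumptions for illustration, so check the actual header of `sentences.csv` before relying on them.

```python
import csv
import io

# Inline sample shaped like sentences.csv; the header names are
# hypothetical -- inspect the real file before depending on them.
sample = "darija,eng\nsalam,hello\nchokran,thank you\n"

rows = list(csv.DictReader(io.StringIO(sample)))
pairs = [(r["darija"], r["eng"]) for r in rows]
```

The same `DictReader` pattern applies to the downloaded file once opened with the declared `,` separator.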
|
NMiriams/Defective_Tires | ---
task_categories:
- feature-extraction
- image-classification
tags:
- tires
- automotives
- defective-tires
pretty_name: defective-tires
size_categories:
- 1K<n<10K
license: cc-by-4.0
--- |
kye/pytorch-repo-code | ---
license: mit
---
|
noob123/small_augemented_nlp_dataset | ---
license: other
---
|
arubenruben/portuguese-language-identification-raw | ---
dataset_info:
- config_name: journalistic
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': pt-PT
'1': pt-BR
splits:
- name: train
num_bytes: 1312620204.0
num_examples: 1845205
download_size: 869968625
dataset_size: 1312620204.0
- config_name: legal
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 1338097227.0
num_examples: 5211975
download_size: 821524458
dataset_size: 1338097227.0
- config_name: literature
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 33472546
num_examples: 82744
download_size: 21387497
dataset_size: 33472546
- config_name: politics
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 64856376.0
num_examples: 47344
download_size: 37697313
dataset_size: 64856376.0
- config_name: social_media
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': pt-PT
'1': pt-BR
splits:
- name: train
num_bytes: 372374266.0
num_examples: 3074774
download_size: 267382814
dataset_size: 372374266.0
- config_name: web
features:
- name: text
dtype: string
- name: domain
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 915101054.0
num_examples: 200000
download_size: 485541943
dataset_size: 915101054.0
configs:
- config_name: journalistic
data_files:
- split: train
path: journalistic/train-*
- config_name: legal
data_files:
- split: train
path: legal/train-*
- config_name: literature
data_files:
- split: train
path: literature/train-*
- config_name: politics
data_files:
- split: train
path: politics/train-*
- config_name: social_media
data_files:
- split: train
path: social_media/train-*
- config_name: web
data_files:
- split: train
path: web/train-*
---
|
malteos/test2 | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- conditional-text-generation
task_ids:
- summarization
paperswithcode_id: cnn-daily-mail-1
pretty_name: CNN / Daily Mail
---
# Dataset Card for [Dataset Name]
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset.
|
h2oai/h2ogpt-oig-oasst1-instruct-cleaned-v1 | ---
license: apache-2.0
language:
- en
thumbnail: https://h2o.ai/etc.clientlibs/h2o/clientlibs/clientlib-site/resources/images/favicon.ico
tags:
- gpt
- llm
- large language model
- open-source
---
# h2oGPT Data Card
## Summary
H2O.ai's `h2ogpt-oig-oasst1-instruct-cleaned-v1` is an open-source instruct-type dataset for fine-tuning large language models, licensed for commercial use.
- Number of rows: `349837`
- Number of columns: `3`
- Column names: `['input', 'source', 'prompt_type']`
## Source
- [Original LAION OIG Dataset](https://github.com/LAION-AI/Open-Instruction-Generalist)
- [LAION OIG data detoxed and filtered down by scripts in h2oGPT repository](https://github.com/h2oai/h2ogpt/blob/main/FINETUNE.md#high-quality-oig-based-instruct-data)
- [Original Open Assistant data in tree structure](https://huggingface.co/datasets/OpenAssistant/oasst1)
- [This flattened dataset created by script in h2oGPT repository](https://github.com/h2oai/h2ogpt/blob/5fc91911bc2bfaaf3b6c2de577c4b0ae45a07a4a/create_data.py#L1253)
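With the documented columns `['input', 'source', 'prompt_type']`, rows from the two source corpora can be separated with a simple filter. This is a minimal sketch over made-up in-memory rows; the example `source` values and `input` strings are assumptions for illustration, not taken from the dataset.

```python
# Hypothetical rows in the documented column layout; values are illustrative.
rows = [
    {"input": "<human>: Hi <bot>: Hello!", "source": "OIG", "prompt_type": "plain"},
    {"input": "<human>: 2+2? <bot>: 4", "source": "OpenAssistant", "prompt_type": "plain"},
]

def by_source(rows, source):
    """Keep only rows originating from the given source corpus."""
    return [r for r in rows if r["source"] == source]
```

The same predicate can be applied to the full dataset after loading it, e.g. via `datasets.load_dataset` with a `filter` call.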
|
chiyuxing/cyx-dataset | ---
license: bsd
---
|
open-llm-leaderboard/details_YeungNLP__firefly-llama2-13b-v1.2 | ---
pretty_name: Evaluation run of YeungNLP/firefly-llama2-13b-v1.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [YeungNLP/firefly-llama2-13b-v1.2](https://huggingface.co/YeungNLP/firefly-llama2-13b-v1.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_YeungNLP__firefly-llama2-13b-v1.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-16T22:16:40.042920](https://huggingface.co/datasets/open-llm-leaderboard/details_YeungNLP__firefly-llama2-13b-v1.2/blob/main/results_2023-09-16T22-16-40.042920.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.1929530201342282,\n\
\ \"em_stderr\": 0.004041241925899649,\n \"f1\": 0.28937080536912874,\n\
\ \"f1_stderr\": 0.004092108997164026,\n \"acc\": 0.43286870958302937,\n\
\ \"acc_stderr\": 0.010534410178374885\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.1929530201342282,\n \"em_stderr\": 0.004041241925899649,\n\
\ \"f1\": 0.28937080536912874,\n \"f1_stderr\": 0.004092108997164026\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11751326762699014,\n \
\ \"acc_stderr\": 0.008870331256489991\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7482241515390686,\n \"acc_stderr\": 0.01219848910025978\n\
\ }\n}\n```"
repo_url: https://huggingface.co/YeungNLP/firefly-llama2-13b-v1.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|arc:challenge|25_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_16T22_16_40.042920
path:
- '**/details_harness|drop|3_2023-09-16T22-16-40.042920.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-16T22-16-40.042920.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_16T22_16_40.042920
path:
- '**/details_harness|gsm8k|5_2023-09-16T22-16-40.042920.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-16T22-16-40.042920.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hellaswag|10_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T12:19:01.767647.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T12:19:01.767647.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T12:19:01.767647.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_16T22_16_40.042920
path:
- '**/details_harness|winogrande|5_2023-09-16T22-16-40.042920.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-16T22-16-40.042920.parquet'
- config_name: results
data_files:
- split: 2023_08_09T12_19_01.767647
path:
- results_2023-08-09T12:19:01.767647.parquet
- split: 2023_09_16T22_16_40.042920
path:
- results_2023-09-16T22-16-40.042920.parquet
- split: latest
path:
- results_2023-09-16T22-16-40.042920.parquet
---
# Dataset Card for Evaluation run of YeungNLP/firefly-llama2-13b-v1.2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/YeungNLP/firefly-llama2-13b-v1.2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [YeungNLP/firefly-llama2-13b-v1.2](https://huggingface.co/YeungNLP/firefly-llama2-13b-v1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_YeungNLP__firefly-llama2-13b-v1.2",
                    "harness_winogrande_5",
                    split="latest")
```
## Latest results
These are the [latest results from run 2023-09-16T22:16:40.042920](https://huggingface.co/datasets/open-llm-leaderboard/details_YeungNLP__firefly-llama2-13b-v1.2/blob/main/results_2023-09-16T22-16-40.042920.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the "results" configuration and the "latest" split for each eval):
```python
{
"all": {
"em": 0.1929530201342282,
"em_stderr": 0.004041241925899649,
"f1": 0.28937080536912874,
"f1_stderr": 0.004092108997164026,
"acc": 0.43286870958302937,
"acc_stderr": 0.010534410178374885
},
"harness|drop|3": {
"em": 0.1929530201342282,
"em_stderr": 0.004041241925899649,
"f1": 0.28937080536912874,
"f1_stderr": 0.004092108997164026
},
"harness|gsm8k|5": {
"acc": 0.11751326762699014,
"acc_stderr": 0.008870331256489991
},
"harness|winogrande|5": {
"acc": 0.7482241515390686,
"acc_stderr": 0.01219848910025978
}
}
```
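The results payload above is a plain nested dictionary keyed by harness task. As a hedged illustration (the payload shape is taken directly from the JSON shown; no task names beyond it are assumed), per-task values for a given metric can be pulled out with ordinary dictionary handling:

```python
def extract_metric(results, metric):
    """Map each harness task to its value for `metric`, skipping the 'all' aggregate."""
    return {
        task: values[metric]
        for task, values in results.items()
        if task != "all" and metric in values
    }


# Example payload, abridged from the results JSON above.
latest = {
    "all": {"acc": 0.43286870958302937, "acc_stderr": 0.010534410178374885},
    "harness|gsm8k|5": {"acc": 0.11751326762699014, "acc_stderr": 0.008870331256489991},
    "harness|winogrande|5": {"acc": 0.7482241515390686, "acc_stderr": 0.01219848910025978},
}

print(extract_metric(latest, "acc"))
```

Tasks that do not report the requested metric (e.g. `em`-only tasks when asking for `acc`) are simply skipped rather than raising a `KeyError`.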
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
tr416/dataset_20231006_234715 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 762696.0
num_examples: 297
- name: test
num_bytes: 7704.0
num_examples: 3
download_size: 73864
dataset_size: 770400.0
---
# Dataset Card for "dataset_20231006_234715"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/eden_honkai3 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of eden (Houkai 3rd)
This is the dataset of eden (Houkai 3rd), containing 124 images and their tags.
The core tags of this character are `long_hair, bangs, breasts, yellow_eyes, purple_hair, hair_between_eyes, hair_ornament, large_breasts, earrings`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 124 | 206.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/eden_honkai3/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 124 | 108.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/eden_honkai3/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 296 | 218.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/eden_honkai3/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 124 | 175.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/eden_honkai3/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 296 | 315.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/eden_honkai3/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/eden_honkai3',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
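If you prefer not to depend on waifuc, the IMG+TXT packages can be read with the standard library alone. The sketch below assumes the package layout described in the table above: each image sits next to a same-named `.txt` file holding comma-separated tags (the `load_img_txt_pairs` helper name and the exact tag-file format are assumptions, not part of the official tooling):

```python
import os


def load_img_txt_pairs(dataset_dir):
    """Collect (image_path, tag_list) pairs from an extracted IMG+TXT package."""
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in (".png", ".jpg", ".jpeg", ".webp"):
            continue  # skip tag files and anything that is not an image
        tag_file = os.path.join(dataset_dir, stem + ".txt")
        tags = []
        if os.path.exists(tag_file):
            with open(tag_file, encoding="utf-8") as f:
                tags = [t.strip() for t in f.read().split(",") if t.strip()]
        pairs.append((os.path.join(dataset_dir, name), tags))
    return pairs
```

For example, `load_img_txt_pairs('dataset_dir')` after extracting `dataset-800.zip` should yield one pair per image, with an empty tag list for any image whose `.txt` file is missing.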
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, long_sleeves, looking_at_viewer, solo, cleavage, purple_dress, black_gloves, smile, closed_mouth, single_glove, chalice, holding_cup, sitting, single_earring |
| 1 | 6 |  |  |  |  |  | 1girl, :d, long_sleeves, looking_at_viewer, open_mouth, solo, black_gloves, single_glove, cleavage, purple_dress, simple_background, single_earring |
| 2 | 9 |  |  |  |  |  | 1girl, smile, solo, cleavage, looking_at_viewer, black_bikini, see-through, sunglasses, eyewear_on_head, navel, closed_mouth, outdoors, blue_sky, cloudy_sky, day, frills, holding, shorts |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | long_sleeves | looking_at_viewer | solo | cleavage | purple_dress | black_gloves | smile | closed_mouth | single_glove | chalice | holding_cup | sitting | single_earring | :d | open_mouth | simple_background | black_bikini | see-through | sunglasses | eyewear_on_head | navel | outdoors | blue_sky | cloudy_sky | day | frills | holding | shorts |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------------------|:-------|:-----------|:---------------|:---------------|:--------|:---------------|:---------------|:----------|:--------------|:----------|:-----------------|:-----|:-------------|:--------------------|:---------------|:--------------|:-------------|:------------------|:--------|:-----------|:-----------|:-------------|:------|:---------|:----------|:---------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | | | X | | | | X | X | X | X | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | | X | X | X | | | X | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|