datasetId | card |
|---|---|
andersonbcdefg/gpt4all | ---
license: other
---
|
0x7o/panorama | ---
dataset_info:
features:
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 35059524.0
num_examples: 14079
download_size: 18789708
dataset_size: 35059524.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
h2oai/openassistant_oasst1_h2ogpt | ---
license: apache-2.0
language:
- en
thumbnail: https://h2o.ai/etc.clientlibs/h2o/clientlibs/clientlib-site/resources/images/favicon.ico
tags:
- gpt
- llm
- large language model
- open-source
---
# h2oGPT Data Card
## Summary
H2O.ai's `openassistant_oasst1_h2ogpt` is an open-source, instruct-style dataset for fine-tuning large language models, licensed for commercial use.
- Number of rows: `48307`
- Number of columns: `3`
- Column names: `['input', 'prompt_type', 'source']`
## Source
- [Original Open Assistant data in tree structure](https://huggingface.co/datasets/OpenAssistant/oasst1)
- [This flattened dataset created by script in h2oGPT repository](https://github.com/h2oai/h2ogpt/blob/83857fcf7d3b712aad5db32207e6db0ab0f780f9/create_data.py#L1252)
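
The flattening performed by that script can be sketched roughly as follows. This is a minimal illustration, not the actual `create_data.py` logic: the toy messages, the `<human>:`/`<bot>:` markers, and the `plain` prompt type are assumptions. It shows how a tree-structured Open Assistant conversation becomes flat rows with this card's columns, `['input', 'prompt_type', 'source']`.

```python
# Sketch (assumed conventions, not the real script): flatten an oasst-style
# message tree by walking every root-to-leaf path and joining the messages.

def flatten_tree(messages, root_id):
    """Emit one flat row per root-to-leaf conversation path."""
    children = {}
    for m in messages:
        children.setdefault(m["parent_id"], []).append(m)

    rows = []

    def walk(node, path):
        role = "<human>:" if node["role"] == "prompter" else "<bot>:"
        path = path + [f"{role} {node['text']}"]
        kids = children.get(node["message_id"], [])
        if not kids:  # leaf: emit one flattened conversation
            rows.append({
                "input": "\n".join(path),
                "prompt_type": "plain",
                "source": "OpenAssistant/oasst1",
            })
        for kid in kids:
            walk(kid, path)

    root = next(m for m in messages if m["message_id"] == root_id)
    walk(root, [])
    return rows

# A root prompt with two alternative assistant replies yields two rows.
messages = [
    {"message_id": "a", "parent_id": None, "role": "prompter", "text": "Hi"},
    {"message_id": "b", "parent_id": "a", "role": "assistant", "text": "Hello!"},
    {"message_id": "c", "parent_id": "a", "role": "assistant", "text": "Hey."},
]
rows = flatten_tree(messages, "a")
print(len(rows))  # 2
```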
|
TREC-AToMiC/AToMiC-Texts-v0.2 | ---
dataset_info:
features:
- name: text_id
dtype: string
- name: page_url
dtype: string
- name: page_title
dtype: string
- name: section_title
dtype: string
- name: context_page_description
dtype: string
- name: context_section_description
dtype: string
- name: media
sequence: string
- name: hierachy
sequence: string
- name: category
sequence: string
- name: source_id
dtype: string
splits:
- name: train
num_bytes: 14378574060.336058
num_examples: 10134744
download_size: 6408012391
dataset_size: 14378574060.336058
license: cc-by-sa-4.0
size_categories:
- 100M<n<1B
---
# Dataset Card for "AToMiC-Texts-Mapped"
## Dataset Description
- **Homepage:** [AToMiC homepage](https://trec-atomic.github.io/)
- **Source:** [WIT](https://github.com/google-research-datasets/wit)
- **Paper:** [WIT: Wikipedia-based Image Text Dataset for Multimodal Multilingual Machine Learning](https://arxiv.org/abs/2103.01913)
### Languages
This dataset contains only English Wikipedia text (parsed from the 20221101 XML dump).
### Data Instances
Each instance is a section of a Wikipedia page. We also provide its page-level information and associated metadata such as categories and media.
The `source_id` can be mapped back to the corresponding instance in the original [WIT data](https://github.com/google-research-datasets/wit/blob/main/DATA.md).
Note that the WIT dataset was crawled from an earlier version of Wikipedia (2020-08-30).
The WIT dataset is mapped to the new dump by pure BM25 matching with [Anserini](https://github.com/castorini/anserini).
### Intended Usage
1. Text collection for Image-to-Text retrieval
2. Language model pretraining
3. Document classification
### Licensing Information
[CC BY-SA 4.0 international license](https://creativecommons.org/licenses/by-sa/4.0/)
### Citation Information
TBA
### Acknowledgement
Thanks to:
- [mwparserfromhell](https://github.com/earwig/mwparserfromhell)
- [Datasets](https://github.com/huggingface/datasets)
- [Anserini](https://github.com/castorini/anserini)
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
VickiCui/MORE | ---
license: cc-by-nc-4.0
---
|
mvkvc/artifact-100k | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': ai
'1': real
splits:
- name: train
num_bytes: 1110613860.0
num_examples: 90000
- name: test
num_bytes: 128196890.0
num_examples: 10000
download_size: 1251405830
dataset_size: 1238810750.0
---
# Dataset Card for "artifact-100k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FinGPT/fingpt-ner | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 241523
num_examples: 511
- name: test
num_bytes: 63634
num_examples: 98
download_size: 105426
dataset_size: 305157
---
# Dataset Card for "fingpt-ner"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hda_nli_hindi | ---
annotations_creators:
- machine-generated
language_creators:
- found
language:
- hi
license:
- mit
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- extended|hindi_discourse
task_categories:
- text-classification
task_ids:
- natural-language-inference
pretty_name: Hindi Discourse Analysis Dataset
dataset_info:
- config_name: HDA hindi nli
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': not-entailment
'1': entailment
- name: topic
dtype:
class_label:
names:
'0': Argumentative
'1': Descriptive
'2': Dialogic
'3': Informative
'4': Narrative
splits:
- name: train
num_bytes: 8721972
num_examples: 31892
- name: validation
num_bytes: 2556118
num_examples: 9460
- name: test
num_bytes: 2646453
num_examples: 9970
download_size: 13519261
dataset_size: 13924543
- config_name: hda nli hindi
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': not-entailment
'1': entailment
- name: topic
dtype:
class_label:
names:
'0': Argumentative
'1': Descriptive
'2': Dialogic
'3': Informative
'4': Narrative
splits:
- name: train
num_bytes: 8721972
num_examples: 31892
- name: validation
num_bytes: 2556118
num_examples: 9460
- name: test
num_bytes: 2646453
num_examples: 9970
download_size: 13519261
dataset_size: 13924543
---
# Dataset Card for Hindi Discourse Analysis Dataset
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **HomePage:** [GitHub](https://github.com/midas-research/hindi-nli-data)
- **Paper:** [Aclweb](https://www.aclweb.org/anthology/2020.aacl-main.71)
- **Point of Contact:** [GitHub](https://github.com/midas-research/hindi-nli-data)
### Dataset Summary
- A dataset for Natural Language Inference (NLI) in Hindi. The Hindi Discourse Analysis (HDA) dataset consists of textual-entailment pairs.
- Each row of the dataset is made up of four columns: Premise, Hypothesis, Label and Topic.
- Premise and Hypothesis are written in Hindi, while the entailment label is in English.
- The entailment label has two values: entailment and not-entailment.
- Entailment means the hypothesis can be inferred from the premise; not-entailment means it cannot.
- The dataset can be used to train models for Natural Language Inference tasks in Hindi.
### Supported Tasks and Leaderboards
- Natural Language Inference for Hindi
### Languages
- Dataset is in Hindi
## Dataset Structure
- Data is structured in TSV format.
- The train, test and dev splits are in separate files.
### Data Instances
An example of 'train' looks as follows.
```
{'hypothesis': 'यह एक वर्णनात्मक कथन है।', 'label': 1, 'premise': 'जैसे उस का सारा चेहरा अपना हो और आँखें किसी दूसरे की जो चेहरे पर पपोटों के पीछे महसूर कर दी गईं।', 'topic': 1}
```
### Data Fields
Each row contains 4 columns:
- premise: string
- hypothesis: string
- label: class label with values that correspond to "not-entailment" (0) or "entailment" (1)
- topic: class label with values that correspond to "Argumentative" (0), "Descriptive" (1), "Dialogic" (2), "Informative" (3) or "Narrative" (4).
### Data Splits
- Train : 31892
- Valid : 9460
- Test : 9970
## Dataset Creation
- We employ a recasting technique from Poliak et al. (2018a,b) to convert publicly available Hindi discourse-analysis classification datasets into textual-entailment (TE) problems.
- In this recasting process, we build template hypotheses for each class in the label taxonomy.
- Then, we pair the original annotated sentence with each of the template hypotheses to create TE samples.
- For more information on the recasting process, refer to the paper: https://www.aclweb.org/anthology/2020.aacl-main.71
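
A minimal sketch of this recasting step follows. Only the "Descriptive" template string is taken from the dataset itself (it appears in the train example in this card); the other template strings are English placeholders, not the authors' actual Hindi templates.

```python
# Recasting sketch: one template hypothesis per discourse-mode class.
# Pairing a sentence with the template of its gold class yields an
# entailment example (label 1); any other template yields not-entailment (0).

TEMPLATES = {
    "Descriptive": "यह एक वर्णनात्मक कथन है।",          # from the train example
    "Narrative": "This is a narrative statement.",       # placeholder
    "Informative": "This is an informative statement.",  # placeholder
}

def recast(sentence, gold_class):
    """Turn one classification example into len(TEMPLATES) TE examples."""
    return [
        {"premise": sentence,
         "hypothesis": hyp,
         "label": 1 if cls == gold_class else 0}  # 1 = entailment
        for cls, hyp in TEMPLATES.items()
    ]

pairs = recast("जैसे उस का सारा चेहरा अपना हो ...", "Descriptive")
print([p["label"] for p in pairs])  # [1, 0, 0]
```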
### Source Data
The source dataset for the recasting process is the BBC Hindi Headlines Dataset (https://github.com/NirantK/hindi2vec/releases/tag/bbc-hindi-v0.1).
#### Initial Data Collection and Normalization
- The initial data was collected by members of the MIDAS Lab from Hindi websites. The annotation was crowdsourced: two random stories were selected from the corpus, and three annotators independently classified each sentence by its discourse mode.
- Please refer to this paper for detailed information: https://www.aclweb.org/anthology/2020.lrec-1.149/
- The discourse is classified into five classes: "Argumentative", "Descriptive", "Dialogic", "Informative" and "Narrative".
#### Who are the source language producers?
Please refer to this paper for detailed information: https://www.aclweb.org/anthology/2020.lrec-1.149/
### Annotations
#### Annotation process
Annotation process has been described in Dataset Creation Section.
#### Who are the annotators?
Annotation was done automatically by machine, via the recasting process described above.
### Personal and Sensitive Information
No personal or sensitive information is included in the dataset.
## Considerations for Using the Data
Please refer to this paper: https://www.aclweb.org/anthology/2020.aacl-main.71
### Discussion of Biases
No known biases exist in the dataset.
Please refer to this paper: https://www.aclweb.org/anthology/2020.aacl-main.71
### Other Known Limitations
No other known limitations, though the size of the data may not be enough to train large models.
## Additional Information
Please refer to this link: https://github.com/midas-research/hindi-nli-data
### Dataset Curators
The repository at https://github.com/midas-research/hindi-nli-data states that:
- This corpus can be used freely for research purposes.
- The paper listed below provides details of the creation and use of the corpus. If you use the corpus, please cite the paper.
- If interested in commercial use of the corpus, send email to midas@iiitd.ac.in.
- If you use the corpus in a product or application, then please credit the authors and Multimodal Digital Media Analysis Lab - Indraprastha Institute of Information Technology, New Delhi appropriately. Also, if you send us an email, we will be thrilled to know about how you have used the corpus.
- Multimodal Digital Media Analysis Lab - Indraprastha Institute of Information Technology, New Delhi, India disclaims any responsibility for the use of the corpus and does not provide technical support. However, the contact listed above will be happy to respond to queries and clarifications.
- Rather than redistributing the corpus, please direct interested parties to this page
- Please feel free to send us an email:
- with feedback regarding the corpus.
- with information on how you have used the corpus.
- if interested in having us analyze your data for natural language inference.
- if interested in a collaborative research project.
### Licensing Information
Copyright (C) 2019 Multimodal Digital Media Analysis Lab - Indraprastha Institute of Information Technology, New Delhi (MIDAS, IIIT-Delhi).
Please contact the authors for any information on the dataset.
### Citation Information
```
@inproceedings{uppal-etal-2020-two,
title = "Two-Step Classification using Recasted Data for Low Resource Settings",
author = "Uppal, Shagun and
Gupta, Vivek and
Swaminathan, Avinash and
Zhang, Haimin and
Mahata, Debanjan and
Gosangi, Rakesh and
Shah, Rajiv Ratn and
Stent, Amanda",
booktitle = "Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing",
month = dec,
year = "2020",
address = "Suzhou, China",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2020.aacl-main.71",
pages = "706--719",
abstract = "An NLP model{'}s ability to reason should be independent of language. Previous works utilize Natural Language Inference (NLI) to understand the reasoning ability of models, mostly focusing on high resource languages like English. To address scarcity of data in low-resource languages such as Hindi, we use data recasting to create NLI datasets for four existing text classification datasets. Through experiments, we show that our recasted dataset is devoid of statistical irregularities and spurious patterns. We further study the consistency in predictions of the textual entailment models and propose a consistency regulariser to remove pairwise-inconsistencies in predictions. We propose a novel two-step classification method which uses textual-entailment predictions for classification task. We further improve the performance by using a joint-objective for classification and textual entailment. We therefore highlight the benefits of data recasting and improvements on classification performance using our approach with supporting experimental results.",
}
```
### Contributions
Thanks to [@avinsit123](https://github.com/avinsit123) for adding this dataset. |
rntc/legacy_e3c | ---
dataset_info:
features:
- name: text
dtype: string
- name: tokens
sequence: string
- name: tokens_offsets
sequence:
sequence: int32
- name: clinical_entity_tags
sequence:
class_label:
names:
'0': O
'1': B-CLINENTITY
'2': I-CLINENTITY
- name: clinical_entity_cuid
sequence: string
- name: temporal_information_tags
sequence:
class_label:
names:
'0': O
'1': B-EVENT
'2': B-ACTOR
'3': B-BODYPART
'4': B-TIMEX3
'5': B-RML
'6': I-EVENT
'7': I-ACTOR
'8': I-BODYPART
'9': I-TIMEX3
'10': I-RML
splits:
- name: en.layer1
num_bytes: 1632165
num_examples: 1520
- name: en.layer2
num_bytes: 3263885
num_examples: 2873
- name: en.layer2.validation
num_bytes: 371196
num_examples: 334
- name: es.layer1
num_bytes: 1599169
num_examples: 1134
- name: es.layer2
num_bytes: 3192361
num_examples: 2347
- name: es.layer2.validation
num_bytes: 352193
num_examples: 261
- name: eu.layer1
num_bytes: 1931109
num_examples: 3126
- name: eu.layer2
num_bytes: 1066405
num_examples: 1594
- name: eu.layer2.validation
num_bytes: 279306
num_examples: 468
- name: fr.layer1
num_bytes: 1610663
num_examples: 1109
- name: fr.layer2
num_bytes: 3358033
num_examples: 2389
- name: fr.layer2.validation
num_bytes: 361816
num_examples: 293
- name: it.layer1
num_bytes: 1633613
num_examples: 1146
- name: it.layer2
num_bytes: 3373977
num_examples: 2436
- name: it.layer2.validation
num_bytes: 366932
num_examples: 275
download_size: 4803032
dataset_size: 24392823
configs:
- config_name: default
data_files:
- split: en.layer1
path: data/en.layer1-*
- split: en.layer2
path: data/en.layer2-*
- split: en.layer2.validation
path: data/en.layer2.validation-*
- split: es.layer1
path: data/es.layer1-*
- split: es.layer2
path: data/es.layer2-*
- split: es.layer2.validation
path: data/es.layer2.validation-*
- split: eu.layer1
path: data/eu.layer1-*
- split: eu.layer2
path: data/eu.layer2-*
- split: eu.layer2.validation
path: data/eu.layer2.validation-*
- split: fr.layer1
path: data/fr.layer1-*
- split: fr.layer2
path: data/fr.layer2-*
- split: fr.layer2.validation
path: data/fr.layer2.validation-*
- split: it.layer1
path: data/it.layer1-*
- split: it.layer2
path: data/it.layer2-*
- split: it.layer2.validation
path: data/it.layer2.validation-*
---
|
arazd/tulu_self_instruct | ---
license: openrail
---
|
timaeus/dsir-pile-10m | ---
license: mit
---
|
p1atdev/ichikara-instruction | ---
dataset_info:
- config_name: 20231115-1
features:
- name: ID
dtype: string
- name: text
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 2007875
num_examples: 1729
download_size: 1148243
dataset_size: 2007875
- config_name: 20231115-2
features:
- name: ID
dtype: string
- name: text
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 341973
num_examples: 316
download_size: 179947
dataset_size: 341973
- config_name: 20231115-5
features:
- name: ID
dtype: string
- name: text
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 976579
num_examples: 858
download_size: 434425
dataset_size: 976579
- config_name: 20231221-002
features:
- name: ID
dtype: string
- name: text
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 3018531
num_examples: 1899
download_size: 1633772
dataset_size: 3018531
- config_name: 20231221-003
features:
- name: ID
dtype: string
- name: text
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 3018541
num_examples: 1899
download_size: 1633766
dataset_size: 3018541
configs:
- config_name: 20231115-1
data_files:
- split: train
path: 20231115-1/train-*
- config_name: 20231115-2
data_files:
- split: train
path: 20231115-2/train-*
- config_name: 20231115-5
data_files:
- split: train
path: 20231115-5/train-*
- config_name: 20231221-002
data_files:
- split: train
path: 20231221-002/train-*
- config_name: 20231221-003
data_files:
- split: train
path: 20231221-003/train-*
license: cc-by-nc-sa-4.0
task_categories:
- text-generation
language:
- ja
pretty_name: ichikara-instruction
size_categories:
- 1K<n<10K
---
## ichikara-instruction (Non Commercial)
[Japanese instruction data for LLMs: public release page](https://liat-aip.sakura.ne.jp/wp/llm%E3%81%AE%E3%81%9F%E3%82%81%E3%81%AE%E6%97%A5%E6%9C%AC%E8%AA%9E%E3%82%A4%E3%83%B3%E3%82%B9%E3%83%88%E3%83%A9%E3%82%AF%E3%82%B7%E3%83%A7%E3%83%B3%E3%83%87%E3%83%BC%E3%82%BF%E4%BD%9C%E6%88%90/llm%E3%81%AE%E3%81%9F%E3%82%81%E3%81%AE%E6%97%A5%E6%9C%AC%E8%AA%9E%E3%82%A4%E3%83%B3%E3%82%B9%E3%83%88%E3%83%A9%E3%82%AF%E3%82%B7%E3%83%A7%E3%83%B3%E3%83%87%E3%83%BC%E3%82%BF-%E5%85%AC%E9%96%8B/)
From the release page:
> A presentation on this data will be given at the 30th Annual Meeting of the Association for Natural Language Processing. If you use the data, please cite the following, together with the homepage.
>
> Satoshi Sekine, Maya Ando, Michiko Goto, Hisami Suzuki, Daisuke Kawahara, Naoya Inoue, Kentaro Inui. ichikara-instruction: Constructing Japanese Instruction Data for LLMs. 30th Annual Meeting of the Association for Natural Language Processing (2024)
Paper: https://www.anlp.jp/proceedings/annual_meeting/2024/pdf_dir/A6-3.pdf
|
olly4/cities-suburbs-small | ---
dataset_info:
features:
- name: image
dtype: image
- name: 'Unnamed: 0'
dtype: int64
- name: description
dtype: string
splits:
- name: train
num_bytes: 872929816.432
num_examples: 2202
download_size: 428529931
dataset_size: 872929816.432
---
# Dataset Card for "cities-suburbs-small"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Multimodal-Fatima/DTD_parition1_test_facebook_opt_1.3b_Attributes_Caption_ns_1880_random | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_1_bs_16
num_bytes: 92259967.0
num_examples: 1880
- name: fewshot_3_bs_16
num_bytes: 93272711.0
num_examples: 1880
download_size: 91287196
dataset_size: 185532678.0
---
# Dataset Card for "DTD_parition1_test_facebook_opt_1.3b_Attributes_Caption_ns_1880_random"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
divyapatel4/Microsoft-PeNS | ---
license: ms-pl
---
|
jayhii/top_50_dataset | ---
license: mit
---
|
Tk108263/Tk | ---
license: apache-2.0
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/0f9134d7 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 184
num_examples: 10
download_size: 1343
dataset_size: 184
---
# Dataset Card for "0f9134d7"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Leul78/persona | ---
license: apache-2.0
---
|
AY000554/Car_plate_OCR_dataset | ---
language:
- ru
tags:
- computer vision
- OCR
- car plate
- Russian car plate recognition
- Nomeroff Net
- AUTO.RIA
size_categories:
- 10K<n<100K
---
# Russian car plate recognition dataset
Car_plate_OCR_dataset is a set of approximately 45.5K images of Russian car plates of a single type (Figure 1), annotated with the plate text. The dataset is intended for training neural networks to recognize a car's plate number from an image of the plate.
It is based on the dataset from the [Nomeroff Net](https://nomeroff.net.ua/#) project. Compared with the original dataset, some images that did not match the annotation format were removed (those whose file name was not the plate number itself).
||
|:-----:|
|Figure 1 - Example of a car plate|
The data is split into training, test and validation subsets:
| Data split | Number of images |
| :----------------: | :--------------------: |
| train | 37775 (83%) |
| val | 4891 (10.7%) |
| test | 2845 (6.3%) |
| all images | 45514 |
The annotation for each image is its file name, which contains the plate number itself, written as uppercase Latin letters and digits.
Examples of plate images and their annotations:
| <br> A129XY196 | <br> K211PA69 |
| :------------------------------------------: | :------------------------------------------: |
| <br> E353TA46 | <br> P895HE96 |
Symbol alphabet: ```1234567890ABEKMHOPCTYX```
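
Since the label is encoded in the file name, a loader can recover and validate it as sketched below. The file names are illustrative; the format regex is an assumption based on the standard single-row Russian plate layout (one letter, three digits, two letters, a two- or three-digit region code), which matches the examples above.

```python
# Recover the plate-number label from an image file name and validate it
# against the dataset's symbol alphabet. File names here are illustrative;
# the plate-format regex is an assumption, not part of the dataset spec.
import os
import re

ALPHABET = "1234567890ABEKMHOPCTYX"
# Assumed format: letter, 3 digits, 2 letters, 2-3 region digits.
PLATE_RE = re.compile(r"[ABEKMHOPCTYX]\d{3}[ABEKMHOPCTYX]{2}\d{2,3}")

def label_from_filename(path):
    plate = os.path.splitext(os.path.basename(path))[0]
    if not all(ch in ALPHABET for ch in plate):
        raise ValueError(f"character outside alphabet: {plate}")
    if not PLATE_RE.fullmatch(plate):
        raise ValueError(f"unexpected plate format: {plate}")
    return plate

print(label_from_filename("train/A129XY196.png"))  # A129XY196
```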
A usage example for this dataset is provided in the [ocr_car_plate](https://github.com/AY000554/ocr_car_plate/tree/main) project.
# License
The original dataset is distributed under the CC BY 4.0 license. See license.txt for details. |
one-sec-cv12/chunk_180 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 18216943920.875
num_examples: 189665
download_size: 16530141980
dataset_size: 18216943920.875
---
# Dataset Card for "chunk_180"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/random_letter_find_passage_train10_eval10_rare | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 2645
num_examples: 30
- name: validation
num_bytes: 1151
num_examples: 10
download_size: 5413
dataset_size: 3796
---
# Dataset Card for "random_letter_find_passage_train10_eval10_rare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
charlieoneill/ttt_resid_streams | ---
dataset_info:
features:
- name: data
sequence:
sequence: float32
splits:
- name: train
num_bytes: 1212558352
num_examples: 4
download_size: 603150637
dataset_size: 1212558352
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/kamiya_nao_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kamiya_nao/神谷奈緒/카미야나오 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of kamiya_nao/神谷奈緒/카미야나오 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags.
The core tags of this character are `brown_hair, long_hair, red_eyes, bangs, blunt_bangs, thick_eyebrows, breasts, hair_bun, single_hair_bun`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 691.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kamiya_nao_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 396.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kamiya_nao_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1225 | 846.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kamiya_nao_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 609.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kamiya_nao_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1225 | 1.18 GiB | [Download](https://huggingface.co/datasets/CyberHarem/kamiya_nao_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kamiya_nao_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, smile, solo, looking_at_viewer, belt, blush, earrings, navel, white_shorts, coat, midriff, open_mouth, bow, frills, hair_ornament, long_sleeves, short_shorts, white_background, black_thighhighs, holding_microphone, idol |
| 1 | 39 |  |  |  |  |  | 1girl, solo, blush, looking_at_viewer, simple_background, white_background, white_shirt, school_uniform, blue_necktie, braid, long_sleeves, striped_necktie, plaid_skirt, pleated_skirt, upper_body, blue_jacket, smile |
| 2 | 7 |  |  |  |  |  | 1girl, looking_at_viewer, open_mouth, smile, solo, blush, hair_flower, fingerless_gloves, thighhighs, skirt, microphone |
| 3 | 10 |  |  |  |  |  | 1girl, elbow_gloves, midriff, skirt, solo, smile, belt, navel, hairband, microphone, open_mouth, black_gloves, blush, looking_at_viewer |
| 4 | 11 |  |  |  |  |  | 1girl, blush, nipples, solo, looking_at_viewer, female_pubic_hair, medium_breasts, navel, large_breasts, completely_nude, sitting, sweat |
| 5 | 5 |  |  |  |  |  | 1girl, blush, looking_at_viewer, open_mouth, solo, wet_shirt, bracelet, see-through, simple_background, white_background, white_shirt, bikini_skirt, low_twintails, navel, purple_bikini, short_sleeves, bikini_under_clothes, cowboy_shot, shirt_lift, smile |
| 6 | 18 |  |  |  |  |  | 1girl, maid_headdress, blush, enmaided, solo, looking_at_viewer, frills, wrist_cuffs, maid_apron, thighhighs, bow, open_mouth, puffy_sleeves, short_sleeves |
| 7 | 7 |  |  |  |  |  | 1girl, looking_at_viewer, navel, solo, blush, cleavage, collarbone, large_breasts, open_mouth, thighs, black_bikini, elbow_gloves, simple_background, white_background, bare_shoulders, black_gloves, black_thighhighs, micro_bikini, side-tie_bikini_bottom, black_choker |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | smile | solo | looking_at_viewer | belt | blush | earrings | navel | white_shorts | coat | midriff | open_mouth | bow | frills | hair_ornament | long_sleeves | short_shorts | white_background | black_thighhighs | holding_microphone | idol | simple_background | white_shirt | school_uniform | blue_necktie | braid | striped_necktie | plaid_skirt | pleated_skirt | upper_body | blue_jacket | hair_flower | fingerless_gloves | thighhighs | skirt | microphone | elbow_gloves | hairband | black_gloves | nipples | female_pubic_hair | medium_breasts | large_breasts | completely_nude | sitting | sweat | wet_shirt | bracelet | see-through | bikini_skirt | low_twintails | purple_bikini | short_sleeves | bikini_under_clothes | cowboy_shot | shirt_lift | maid_headdress | enmaided | wrist_cuffs | maid_apron | puffy_sleeves | cleavage | collarbone | thighs | black_bikini | bare_shoulders | micro_bikini | side-tie_bikini_bottom | black_choker |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:--------------------|:-------|:--------|:-----------|:--------|:---------------|:-------|:----------|:-------------|:------|:---------|:----------------|:---------------|:---------------|:-------------------|:-------------------|:---------------------|:-------|:--------------------|:--------------|:-----------------|:---------------|:--------|:------------------|:--------------|:----------------|:-------------|:--------------|:--------------|:--------------------|:-------------|:--------|:-------------|:---------------|:-----------|:---------------|:----------|:--------------------|:-----------------|:----------------|:------------------|:----------|:--------|:------------|:-----------|:--------------|:---------------|:----------------|:----------------|:----------------|:-----------------------|:--------------|:-------------|:-----------------|:-----------|:--------------|:-------------|:----------------|:-----------|:-------------|:---------|:---------------|:-----------------|:---------------|:-------------------------|:---------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 39 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | | | X | | X | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | X | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 10 |  |  |  |  |  | X | X | X | X | X | X | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 11 |  |  |  |  |  | X | | X | X | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | X | X | | X | | X | | | | X | | | | | | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 6 | 18 |  |  |  |  |  | X | | X | X | | X | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | | | | X | X | X | X | X | | | | | | | | |
| 7 | 7 |  |  |  |  |  | X | | X | X | | X | | X | | | | X | | | | | | X | X | | | X | | | | | | | | | | | | | | | X | | X | | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X |
|
liuyanchen1015/MULTI_VALUE_rte_double_modals | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 341284
num_examples: 759
- name: train
num_bytes: 296520
num_examples: 658
download_size: 411372
dataset_size: 637804
---
# Dataset Card for "MULTI_VALUE_rte_double_modals"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Alka-1/Layla-jp | ---
license: mit
---
|
CyberHarem/lisa_genshin | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of lisa/リサ/丽莎 (Genshin Impact)
This is the dataset of lisa/リサ/丽莎 (Genshin Impact), containing 500 images and their tags.
The core tags of this character are `long_hair, breasts, brown_hair, green_eyes, large_breasts, hat, hair_ornament, witch_hat, purple_headwear, hair_between_eyes, hair_flower`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:------------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 1022.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lisa_genshin/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 853.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lisa_genshin/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1341 | 1.73 GiB | [Download](https://huggingface.co/datasets/CyberHarem/lisa_genshin/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/lisa_genshin',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 33 |  |  |  |  |  | 1girl, green_headwear, official_alternate_costume, solo, vision_(genshin_impact), looking_at_viewer, cleavage, smile, twin_braids, puffy_long_sleeves, dress, thighlet, beret, purple_rose, parted_lips, neck_ring, thighs, holding_book |
| 1 | 8 |  |  |  |  |  | 1girl, black_gloves, black_thighhighs, cleavage, dress, hat_flower, holding_book, looking_at_viewer, smile, solo, vision_(genshin_impact), witch, jewelry, hat_belt, purple_capelet, purple_rose |
| 2 | 15 |  |  |  |  |  | 1girl, black_gloves, cleavage, solo, dress, looking_at_viewer, smile, purple_rose, hat_flower, jewelry, upper_body, parted_lips, vision_(genshin_impact), witch |
| 3 | 5 |  |  |  |  |  | 1girl, blush, collarbone, looking_at_viewer, outdoors, parted_lips, patreon_username, solo, wet, navel, nipples, stomach, cleavage, cloud, completely_nude, day, hair_over_shoulder, rock, thighs, water, artist_name, bare_shoulders, beach, blue_sky, grin, ocean, patreon_logo, petals, purple_rose, pussy, shore, tree, upper_body |
| 4 | 5 |  |  |  |  |  | 1girl, bare_shoulders, black_gloves, blue_leotard, blush, cosplay, elbow_gloves, highleg_leotard, thighs, black_pantyhose, bodystocking, cleavage, covered_navel, detached_sleeves, gold_trim, parted_lips, solo, thighlet, blue_headwear, choker, collarbone, looking_at_viewer, bookshelf, hat_ornament, purple_leotard, rose, sitting, smile |
| 5 | 5 |  |  |  |  |  | 1boy, 1girl, blush, dark-skinned_male, erection, heart-shaped_pupils, hetero, indoors, interracial, large_penis, solo_focus, uncensored, veiny_penis, rose, cleavage, dark_penis, huge_penis, open_mouth, purple_bra, sweat, blurry_background, collarbone, cum, hair_over_shoulder, half-closed_eyes, licking_penis, looking_at_viewer, nude, pov, saliva, tongue_out, very_dark_skin |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | green_headwear | official_alternate_costume | solo | vision_(genshin_impact) | looking_at_viewer | cleavage | smile | twin_braids | puffy_long_sleeves | dress | thighlet | beret | purple_rose | parted_lips | neck_ring | thighs | holding_book | black_gloves | black_thighhighs | hat_flower | witch | jewelry | hat_belt | purple_capelet | upper_body | blush | collarbone | outdoors | patreon_username | wet | navel | nipples | stomach | cloud | completely_nude | day | hair_over_shoulder | rock | water | artist_name | bare_shoulders | beach | blue_sky | grin | ocean | patreon_logo | petals | pussy | shore | tree | blue_leotard | cosplay | elbow_gloves | highleg_leotard | black_pantyhose | bodystocking | covered_navel | detached_sleeves | gold_trim | blue_headwear | choker | bookshelf | hat_ornament | purple_leotard | rose | sitting | 1boy | dark-skinned_male | erection | heart-shaped_pupils | hetero | indoors | interracial | large_penis | solo_focus | uncensored | veiny_penis | dark_penis | huge_penis | open_mouth | purple_bra | sweat | blurry_background | cum | half-closed_eyes | licking_penis | nude | pov | saliva | tongue_out | very_dark_skin |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-----------------------------|:-------|:--------------------------|:--------------------|:-----------|:--------|:--------------|:---------------------|:--------|:-----------|:--------|:--------------|:--------------|:------------|:---------|:---------------|:---------------|:-------------------|:-------------|:--------|:----------|:-----------|:-----------------|:-------------|:--------|:-------------|:-----------|:-------------------|:------|:--------|:----------|:----------|:--------|:------------------|:------|:---------------------|:-------|:--------|:--------------|:-----------------|:--------|:-----------|:-------|:--------|:---------------|:---------|:--------|:--------|:-------|:---------------|:----------|:---------------|:------------------|:------------------|:---------------|:----------------|:-------------------|:------------|:----------------|:---------|:------------|:---------------|:-----------------|:-------|:----------|:-------|:--------------------|:-----------|:----------------------|:---------|:----------|:--------------|:--------------|:-------------|:-------------|:--------------|:-------------|:-------------|:-------------|:-------------|:--------|:--------------------|:------|:-------------------|:----------------|:-------|:------|:---------|:-------------|:-----------------|
| 0 | 33 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | | | X | X | X | X | X | | | X | | | X | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 15 |  |  |  |  |  | X | | | X | X | X | X | X | | | X | | | X | X | | | | X | | X | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | | X | | X | X | | | | | | | X | X | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | | X | | X | X | X | | | | X | | | X | | X | | X | | | | | | | | X | X | | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
nihalbaig/alpaca-bangla | ---
dataset_info:
features:
- name: text
dtype: 'null'
- name: inputs
struct:
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: prediction
list:
- name: label
dtype: string
- name: score
dtype: float64
- name: prediction_agent
dtype: 'null'
- name: annotation
dtype: 'null'
- name: annotation_agent
dtype: 'null'
- name: vectors
dtype: 'null'
- name: multi_label
dtype: bool
- name: explanation
dtype: 'null'
- name: id
dtype: string
- name: metadata
dtype: 'null'
- name: status
dtype: string
- name: event_timestamp
dtype: timestamp[us]
- name: metrics
dtype: 'null'
splits:
- name: train
num_bytes: 36188108
num_examples: 18000
download_size: 13437852
dataset_size: 36188108
---
# Dataset Card for "alpaca-bangla"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KabilanM/plant-label-classification | ---
dataset_info:
features:
- name: image
dtype: image
- name: objects
sequence:
- name: bbox
sequence: float32
length: 4
- name: categories
dtype:
class_label:
names:
'0': Old Label
'1': New Label
splits:
- name: train
num_bytes: 831609383.0
num_examples: 15
download_size: 831411231
dataset_size: 831609383.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "plant-label-classification"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
one-sec-cv12/chunk_45 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 22818315456.5
num_examples: 237572
download_size: 20314789086
dataset_size: 22818315456.5
---
# Dataset Card for "chunk_45"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CalamityChain/FineTuningSD | ---
license: afl-3.0
---
|
akkasi/metooma | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: TweetId
dtype: string
- name: labels
sequence: float64
- name: label2idx
dtype: string
- name: idx2label
dtype: string
splits:
- name: train
num_bytes: 2991750
num_examples: 7978
- name: test
num_bytes: 748125
num_examples: 1995
download_size: 195958
dataset_size: 3739875
---
# Dataset Card for "metooma_new"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_qqp_drop_copula_be_NP | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 986671
num_examples: 6782
- name: test
num_bytes: 9987748
num_examples: 67911
- name: train
num_bytes: 8876057
num_examples: 61027
download_size: 11817070
dataset_size: 19850476
---
# Dataset Card for "MULTI_VALUE_qqp_drop_copula_be_NP"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alpindale/visual-novels | ---
license: apache-2.0
task_categories:
- conversational
- text-generation
language:
- en
pretty_name: Visual Novels
---
# Visual Novel Dataset
This dataset contains parsed Visual Novel scripts for training language models, totaling approximately 60 million tokens.
## Dataset Structure
The dataset follows a general structure for visual novel scripts:
- Dialogue lines: formatted as the speaker's name followed by a colon, with the dialogue itself enclosed in quotes. For example:
```
John: "Hello, how are you?"
```
- Actions and narration: often enclosed in asterisks, though not all visual novels follow this convention. These lines describe character movements, background settings, or other narrative elements. For example:
```
*John looked around the room, searching for answers.*
```
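The two line formats above can be told apart with a couple of regular expressions. The following is a minimal sketch; the function name and patterns are illustrative assumptions based on the conventions described in this card, not a parser shipped with the dataset:

```python
import re

# Dialogue: `Speaker: "text"`; narration: `*text*` (assumed conventions).
DIALOGUE_RE = re.compile(r'^(?P<speaker>[^:]+):\s*"(?P<text>.*)"\s*$')
NARRATION_RE = re.compile(r'^\*(?P<text>.*)\*$')

def classify_line(line: str) -> tuple[str, dict]:
    """Classify a script line as dialogue, narration, or other."""
    line = line.strip()
    m = DIALOGUE_RE.match(line)
    if m:
        return "dialogue", {"speaker": m.group("speaker"), "text": m.group("text")}
    m = NARRATION_RE.match(line)
    if m:
        return "narration", {"text": m.group("text")}
    return "other", {"text": line}
```

Since some visual novels deviate from the asterisk convention, lines that match neither pattern fall through to `"other"` rather than raising.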
## Contents
- `visual-novels.txt`: This file contains all the parsed VNs concatenated into a single plaintext file. Each entry is separated by this string:
```
[ - title - {visual-novel-title-1.txt} ]
```
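Given that separator, the concatenated file can be split back into per-title scripts. This is a sketch under the assumption that the separator appears exactly as shown above; the function and variable names are illustrative, not utilities shipped with the dataset:

```python
import re

# Matches the title separator, e.g. `[ - title - {visual-novel-title-1.txt} ]`.
SEPARATOR_RE = re.compile(r'\[ - title - \{(?P<title>[^}]+)\} \]')

def split_scripts(text: str) -> dict[str, str]:
    """Map each visual novel title to the script text that follows its separator."""
    scripts: dict[str, str] = {}
    matches = list(SEPARATOR_RE.finditer(text))
    for i, m in enumerate(matches):
        start = m.end()
        end = matches[i + 1].start() if i + 1 < len(matches) else len(text)
        scripts[m.group("title")] = text[start:end].strip()
    return scripts
```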
- `VNDB/`: This directory contains `.json` files that contain VNDB IDs for the corresponding VN's characters. Does not include unparsed VNs.
- `Archives/visual-novels-parsed.tar.zst`: This archive contains the parsed VNs but with each script in a separate text file (i.e. not concatenated).
- `Archives/visual-novels-unparsed.tar.zst`: This archive contains all the unparsed VNs along with the original script for the currently parsed VNs.
## Usage
You can use this dataset to train language models, particularly for natural language processing and text generation tasks. The parsed visual novel scripts let models learn dialogue structure and generate coherent responses, while the unparsed scripts allow for further analysis and custom processing.
## Contribution
This dataset was gathered and parsed by the [PygmalionAI](https://huggingface.co/PygmalionAI) Data Processing Team. Listed below are the team members, sorted by contribution amount:
- **Suikamelon**: [HuggingFace](https://huggingface.co/lemonilia) - (2,787,704 ++ 672,473 --)
- **Alpin**: [HuggingFace](https://huggingface.co/alpindale) - [GitHub](https://github.com/AlpinDale) (1,170,985 ++ 345,120 --)
- **Spartan**: [GitHub](https://github.com/Spartan9772) (901,046 ++ 467,915 --)
- **Unlucky-AI** [GitHub](https://github.com/Unlucky-AI) (253,316 ++ 256 --)
## Citation
If you use this dataset in your research or projects, please cite it appropriately.
## Acknowledgements
This dataset is compiled and shared for research and educational purposes. The dataset includes parsed visual novel scripts from various sources, which are predominantly copyrighted and owned by their respective publishers and creators. The inclusion of these scripts in this dataset does not imply any endorsement or authorization from the copyright holders.
We would like to express our sincere gratitude to the original copyright holders and creators of the visual novels for their valuable contributions to the art and storytelling. We respect and acknowledge their intellectual property rights.
We strongly encourage users of this dataset to adhere to copyright laws and any applicable licensing restrictions when using or analyzing the provided content. It is the responsibility of the users to ensure that any use of the dataset complies with the legal requirements governing intellectual property and fair use.
Please be aware that the creators and distributors of this dataset disclaim any liability or responsibility for any unauthorized or illegal use of the dataset by third parties.
If you are a copyright holder or have any concerns about the content included in this dataset, please contact us at [this email address](mailto:alpin@alpindale.dev) to discuss the matter further and address any potential issues.
|
minh21/COVID-QA-question-answering-biencoder-data-65_25_10 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: context_chunks
sequence: string
- name: document_id
dtype: int64
- name: id
dtype: int64
splits:
- name: train
num_bytes: 55383294
num_examples: 1170
- name: validation
num_bytes: 5172033
num_examples: 140
download_size: 16954453
dataset_size: 60555327
---
# Dataset Card for "COVID-QA-question-answering-biencoder-data-65_25_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sorbhet/llamakrity | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 6965366
num_examples: 10000
download_size: 3780553
dataset_size: 6965366
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
k0ntra/tehran | ---
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
splits:
- name: train
num_bytes: 113664
num_examples: 37
download_size: 453002
dataset_size: 113664
---
# Dataset Card for "tehran"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Pampkinus/Alexander-Lukashenko | ---
license: openrail
---
A faceset of the Belarusian president Alexander Lukashenko, 33910 images (jpg)
https://en.wikipedia.org/wiki/Alexander_Lukashenko |
roupenminassian/vehicle-dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: image_id
dtype: int64
- name: width
dtype: int64
- name: height
dtype: int64
- name: objects
struct:
- name: id
sequence: int64
- name: area
sequence: float64
- name: bbox
sequence:
sequence: float64
- name: category
sequence: int64
splits:
- name: train
num_bytes: 74749784.0
num_examples: 618
download_size: 74708626
dataset_size: 74749784.0
---
# Dataset Card for "vehicle-dataset"
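The `objects` column declared above is a struct of parallel sequences, one entry per annotation, in COCO style (`bbox` = `[x, y, w, h]`). A minimal sketch of turning one such record into a per-annotation list; the sample values below are hypothetical, not drawn from the dataset:

```python
# Hypothetical record matching the schema in the YAML above:
# objects is a struct of parallel lists, COCO-style (bbox = [x, y, w, h]).
objects = {
    "id": [1, 2],
    "area": [1200.0, 800.0],
    "bbox": [[10.0, 20.0, 40.0, 30.0], [50.0, 60.0, 20.0, 40.0]],
    "category": [0, 1],
}

# Re-zip the parallel lists into one dict per annotation.
annotations = [
    {"id": i, "area": a, "bbox": b, "category": c}
    for i, a, b, c in zip(
        objects["id"], objects["area"], objects["bbox"], objects["category"]
    )
]
print(annotations[0]["bbox"])  # [10.0, 20.0, 40.0, 30.0]
```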
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
eihli/micro-ok-vqa | ---
dataset_info:
features:
- name: image
dtype: image
- name: question_type
dtype: string
- name: confidence
dtype: int32
- name: answers
list:
- name: answer
dtype: string
- name: raw_answer
dtype: string
- name: answer_confidence
dtype: string
- name: answer_id
dtype: int64
- name: image_id
dtype: int64
- name: answer_type
dtype: string
- name: question_id
dtype: int64
- name: question
dtype: string
splits:
- name: train
num_bytes: 12974143.0
num_examples: 80
- name: validation
num_bytes: 3538286.0
num_examples: 20
download_size: 16437576
dataset_size: 16512429.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
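Each example carries a list of per-annotator `answers` structs; a common preprocessing step is majority-voting them into a single label. A minimal sketch with hypothetical values mirroring the schema above:

```python
from collections import Counter

# Hypothetical answers list mirroring the schema (one dict per annotator).
answers = [
    {"answer": "dog", "raw_answer": "a dog", "answer_confidence": "yes", "answer_id": 1},
    {"answer": "dog", "raw_answer": "dog", "answer_confidence": "yes", "answer_id": 2},
    {"answer": "wolf", "raw_answer": "wolf", "answer_confidence": "maybe", "answer_id": 3},
]

# Majority vote over the normalized "answer" field.
majority, count = Counter(a["answer"] for a in answers).most_common(1)[0]
print(majority)  # dog
```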
|
unk1911/ddpm-butterflies-128 | ---
license: apache-2.0
---
|
HimuraZ/Ashe | ---
license: openrail
---
|
ixelszy/DaikiKase_Lora | ---
license: afl-3.0
task_categories:
- image-classification
language:
- en
tags:
- art
- not-for-all-audiences
- nsfw
- lora
pretty_name: DaikiKase
size_categories:
- 1K<n<10K
source_datasets:
- 加瀬大輝(DaikiKase) Pixiv
dataset_info:
features:
- name: image
dtype: image
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 3576319370.552
num_examples: 2668
download_size: 3586311849
dataset_size: 3576319370.552
---
|
OneFly7/llama2-politosphere-fine-tuning-system-prompt_with_definition | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: text
dtype: string
- name: label_text
dtype: string
splits:
- name: train
num_bytes: 184692
num_examples: 113
- name: validation
num_bytes: 182440
num_examples: 113
download_size: 66387
dataset_size: 367132
---
# Dataset Card for "llama2-politosphere-fine-tuning-system-prompt_with_definition"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
johnny46/Surah-Baqarah | ---
license: openrail
---
|
vg055/RestMex2023_review-corpus_DataAugV1 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 121676941
num_examples: 332823
download_size: 74199966
dataset_size: 121676941
---
# Dataset Card for "RestMex2023_review-corpus_DataAugV1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nick-carroll1/lyrics_dataset | ---
dataset_info:
features:
- name: Artist
dtype: string
- name: Song
dtype: string
- name: Lyrics
dtype: string
splits:
- name: train
num_bytes: 371464
num_examples: 237
download_size: 166829
dataset_size: 371464
---
# Dataset Card for "lyrics_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Locutusque/ColumnedChatCombined | ---
license: openrail
task_categories:
- conversational
- question-answering
- text-generation
language:
- en
- zh
size_categories:
- 1M<n<10M
---
## This dataset is a version of the ChatCombined dataset where each conversation turn is separated into three columns.
These three columns are:
- "System" - a string with a system prompt
- "User" - a string with user input
- "Assistant" - a string containing the model output
# You can load the dataset like this
```python
import json

# The dataset ships as a single JSON file with "train" and "validation" splits.
with open("formatted_data.json") as f:
    data = json.load(f)
val_data = data["validation"]
data = data["train"]
```
### Example usage
```python
# Example __getitem__ for a torch-style Dataset that wraps the loaded list.
def __getitem__(self, idx):
    # Strip stray leading/trailing newlines from each column.
    system = self.data[idx]["System"].strip('\n')
    user = self.data[idx]["User"].strip('\n')
    assistant = self.data[idx]["Assistant"].strip('\n')
    return system, user, assistant
```
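For fine-tuning, the three columns are typically joined into a single training string. The template below is one plausible choice, not a format prescribed by this dataset; adjust the delimiters to your model's chat format:

```python
def format_example(system: str, user: str, assistant: str) -> str:
    # Assumed chat template; the delimiter tokens are illustrative only.
    return f"<|system|>\n{system}\n<|user|>\n{user}\n<|assistant|>\n{assistant}"

print(format_example("You are helpful.", "Hi!", "Hello, how can I help?"))
```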
## Citations
```
@misc{huggingface2023,
title={dmayhem93/ChatCombined},
author={{dmayhem93}},
year=2023,
url="https://huggingface.co/datasets/dmayhem93/ChatCombined"
}
``` |
robertmyers/prompting-rm-gpt4 | ---
license: mit
---
|
irds/beir_dbpedia-entity | ---
pretty_name: '`beir/dbpedia-entity`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `beir/dbpedia-entity`
The `beir/dbpedia-entity` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/beir#beir/dbpedia-entity).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=4,635,922
- `queries` (i.e., topics); count=467
This dataset is used by: [`beir_dbpedia-entity_dev`](https://huggingface.co/datasets/irds/beir_dbpedia-entity_dev), [`beir_dbpedia-entity_test`](https://huggingface.co/datasets/irds/beir_dbpedia-entity_test)
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/beir_dbpedia-entity', 'docs')
for record in docs:
record # {'doc_id': ..., 'text': ..., 'title': ..., 'url': ...}
queries = load_dataset('irds/beir_dbpedia-entity', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Hasibi2017DBpediaEntityVA,
title={DBpedia-Entity v2: A Test Collection for Entity Search},
author={Faegheh Hasibi and Fedor Nikolaev and Chenyan Xiong and K. Balog and S. E. Bratsberg and Alexander Kotov and J. Callan},
journal={Proceedings of the 40th International ACM SIGIR Conference on Research and Development in Information Retrieval},
year={2017}
}
@article{Thakur2021Beir,
title = "BEIR: A Heterogenous Benchmark for Zero-shot Evaluation of Information Retrieval Models",
author = "Thakur, Nandan and Reimers, Nils and Rücklé, Andreas and Srivastava, Abhishek and Gurevych, Iryna",
journal= "arXiv preprint arXiv:2104.08663",
month = "4",
year = "2021",
url = "https://arxiv.org/abs/2104.08663",
}
```
|
open-llm-leaderboard/details_Nitral-AI__Eris_PrimeV3.075-Vision-7B | ---
pretty_name: Evaluation run of Nitral-AI/Eris_PrimeV3.075-Vision-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Nitral-AI/Eris_PrimeV3.075-Vision-7B](https://huggingface.co/Nitral-AI/Eris_PrimeV3.075-Vision-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Nitral-AI__Eris_PrimeV3.075-Vision-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-24T15:10:24.752447](https://huggingface.co/datasets/open-llm-leaderboard/details_Nitral-AI__Eris_PrimeV3.075-Vision-7B/blob/main/results_2024-03-24T15-10-24.752447.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6522395840846923,\n\
\ \"acc_stderr\": 0.032126555302262896,\n \"acc_norm\": 0.6532177215110839,\n\
\ \"acc_norm_stderr\": 0.03278002631705583,\n \"mc1\": 0.4504283965728274,\n\
\ \"mc1_stderr\": 0.017417264371967642,\n \"mc2\": 0.627156475868207,\n\
\ \"mc2_stderr\": 0.015153151941834196\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6578498293515358,\n \"acc_stderr\": 0.013864152159177278,\n\
\ \"acc_norm\": 0.6825938566552902,\n \"acc_norm_stderr\": 0.013602239088038167\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6816371240788688,\n\
\ \"acc_stderr\": 0.0046488907875817,\n \"acc_norm\": 0.8643696474805815,\n\
\ \"acc_norm_stderr\": 0.0034169585913247946\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n\
\ \"acc_stderr\": 0.03533133389323657,\n \"acc_norm\": 0.6878612716763006,\n\
\ \"acc_norm_stderr\": 0.03533133389323657\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851105,\n \"\
acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851105\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"\
acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"\
acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.40370370370370373,\n \"acc_stderr\": 0.029914812342227624,\n \
\ \"acc_norm\": 0.40370370370370373,\n \"acc_norm_stderr\": 0.029914812342227624\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8256880733944955,\n \"acc_stderr\": 0.016265675632010333,\n \"\
acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.016265675632010333\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8270042194092827,\n \"acc_stderr\": 0.024621562866768424,\n \
\ \"acc_norm\": 0.8270042194092827,\n \"acc_norm_stderr\": 0.024621562866768424\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709697,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709697\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n\
\ \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n\
\ \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3664804469273743,\n\
\ \"acc_stderr\": 0.016115235504865478,\n \"acc_norm\": 0.3664804469273743,\n\
\ \"acc_norm_stderr\": 0.016115235504865478\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826517,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826517\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712992,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712992\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873862,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873862\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n\
\ \"acc_stderr\": 0.012744149704869647,\n \"acc_norm\": 0.4680573663624511,\n\
\ \"acc_norm_stderr\": 0.012744149704869647\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7095588235294118,\n \"acc_stderr\": 0.02757646862274054,\n\
\ \"acc_norm\": 0.7095588235294118,\n \"acc_norm_stderr\": 0.02757646862274054\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083376,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083376\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4504283965728274,\n\
\ \"mc1_stderr\": 0.017417264371967642,\n \"mc2\": 0.627156475868207,\n\
\ \"mc2_stderr\": 0.015153151941834196\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8105761641673244,\n \"acc_stderr\": 0.011012790432989245\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.643669446550417,\n \
\ \"acc_stderr\": 0.013191685031357463\n }\n}\n```"
repo_url: https://huggingface.co/Nitral-AI/Eris_PrimeV3.075-Vision-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|arc:challenge|25_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|gsm8k|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hellaswag|10_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T15-10-24.752447.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T15-10-24.752447.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- '**/details_harness|winogrande|5_2024-03-24T15-10-24.752447.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-24T15-10-24.752447.parquet'
- config_name: results
data_files:
- split: 2024_03_24T15_10_24.752447
path:
- results_2024-03-24T15-10-24.752447.parquet
- split: latest
path:
- results_2024-03-24T15-10-24.752447.parquet
---
# Dataset Card for Evaluation run of Nitral-AI/Eris_PrimeV3.075-Vision-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Nitral-AI/Eris_PrimeV3.075-Vision-7B](https://huggingface.co/Nitral-AI/Eris_PrimeV3.075-Vision-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Nitral-AI__Eris_PrimeV3.075-Vision-7B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-24T15:10:24.752447](https://huggingface.co/datasets/open-llm-leaderboard/details_Nitral-AI__Eris_PrimeV3.075-Vision-7B/blob/main/results_2024-03-24T15-10-24.752447.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6522395840846923,
"acc_stderr": 0.032126555302262896,
"acc_norm": 0.6532177215110839,
"acc_norm_stderr": 0.03278002631705583,
"mc1": 0.4504283965728274,
"mc1_stderr": 0.017417264371967642,
"mc2": 0.627156475868207,
"mc2_stderr": 0.015153151941834196
},
"harness|arc:challenge|25": {
"acc": 0.6578498293515358,
"acc_stderr": 0.013864152159177278,
"acc_norm": 0.6825938566552902,
"acc_norm_stderr": 0.013602239088038167
},
"harness|hellaswag|10": {
"acc": 0.6816371240788688,
"acc_stderr": 0.0046488907875817,
"acc_norm": 0.8643696474805815,
"acc_norm_stderr": 0.0034169585913247946
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.03533133389323657,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.03533133389323657
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.025138091388851105,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.025138091388851105
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.40370370370370373,
"acc_stderr": 0.029914812342227624,
"acc_norm": 0.40370370370370373,
"acc_norm_stderr": 0.029914812342227624
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.016265675632010333,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.016265675632010333
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8270042194092827,
"acc_stderr": 0.024621562866768424,
"acc_norm": 0.8270042194092827,
"acc_norm_stderr": 0.024621562866768424
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709697,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709697
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3664804469273743,
"acc_stderr": 0.016115235504865478,
"acc_norm": 0.3664804469273743,
"acc_norm_stderr": 0.016115235504865478
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826517,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826517
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712992,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712992
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873862,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873862
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869647,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869647
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7095588235294118,
"acc_stderr": 0.02757646862274054,
"acc_norm": 0.7095588235294118,
"acc_norm_stderr": 0.02757646862274054
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083376,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083376
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4504283965728274,
"mc1_stderr": 0.017417264371967642,
"mc2": 0.627156475868207,
"mc2_stderr": 0.015153151941834196
},
"harness|winogrande|5": {
"acc": 0.8105761641673244,
"acc_stderr": 0.011012790432989245
},
"harness|gsm8k|5": {
"acc": 0.643669446550417,
"acc_stderr": 0.013191685031357463
}
}
```
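For quick triage, the per-task accuracies above can be ranked with a short script. This is only an illustration: the dictionary below hard-codes a handful of values copied from the results JSON, whereas in practice you would parse the full results file from the repository.

```python
# Rank Open LLM Leaderboard subtask accuracies (values copied from the
# results JSON above; load the full results file for all tasks).
results = {
    "harness|hendrycksTest-abstract_algebra|5": 0.37,
    "harness|hendrycksTest-high_school_government_and_politics|5": 0.9119170984455959,
    "harness|hendrycksTest-moral_scenarios|5": 0.3664804469273743,
    "harness|hendrycksTest-marketing|5": 0.8846153846153846,
}

# Sort tasks from strongest to weakest accuracy.
ranked = sorted(results.items(), key=lambda kv: kv[1], reverse=True)
for task, acc in ranked:
    # Strip the harness prefix and shot count for readability.
    name = task.split("|")[1].removeprefix("hendrycksTest-")
    print(f"{name:45s} {acc:.3f}")
```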
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
BangumiBase/fruitsbasket | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Fruits Basket
This is the image base of the bangumi Fruits Basket. We detected 59 characters and 6,849 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; some noise may remain.** If you intend to train models on this dataset, we recommend preprocessing the downloaded files to eliminate potential noisy samples (approximately 1% probability).
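For programmatic access, a single character pack can be fetched and unpacked with `huggingface_hub`. This is a minimal sketch: the `N/dataset.zip` layout mirrors the download links in the table below, with the noise bucket published under `-1`.

```python
import zipfile
from pathlib import Path


def pack_path(character_id: int) -> str:
    """Relative path of one character's archive (the noise bucket uses id -1)."""
    return f"{character_id}/dataset.zip"


def fetch_character_pack(character_id: int, out_dir: str = "fruitsbasket") -> Path:
    """Download and extract one character's image pack from the Hub."""
    # Third-party dependency: pip install huggingface_hub
    from huggingface_hub import hf_hub_download

    archive = hf_hub_download(
        repo_id="BangumiBase/fruitsbasket",
        filename=pack_path(character_id),
        repo_type="dataset",
    )
    target = Path(out_dir) / str(character_id)
    target.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(archive) as zf:
        zf.extractall(target)
    return target
```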
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 886 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 223 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 210 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 72 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 51 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 118 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 542 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 125 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 73 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 74 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 80 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 29 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 24 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 115 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 755 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 30 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 42 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 62 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 66 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 41 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 78 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 76 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 1036 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 118 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 60 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 40 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 27 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 37 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 26 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 29 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 20 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 41 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 10 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 15 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 32 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 9 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 11 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 210 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 68 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 15 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 106 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 29 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 19 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 18 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 33 | [Download](44/dataset.zip) |  |  |  |  |  |  |  |  |
| 45 | 50 | [Download](45/dataset.zip) |  |  |  |  |  |  |  |  |
| 46 | 221 | [Download](46/dataset.zip) |  |  |  |  |  |  |  |  |
| 47 | 52 | [Download](47/dataset.zip) |  |  |  |  |  |  |  |  |
| 48 | 21 | [Download](48/dataset.zip) |  |  |  |  |  |  |  |  |
| 49 | 241 | [Download](49/dataset.zip) |  |  |  |  |  |  |  |  |
| 50 | 113 | [Download](50/dataset.zip) |  |  |  |  |  |  |  |  |
| 51 | 19 | [Download](51/dataset.zip) |  |  |  |  |  |  |  |  |
| 52 | 23 | [Download](52/dataset.zip) |  |  |  |  |  |  |  |  |
| 53 | 34 | [Download](53/dataset.zip) |  |  |  |  |  |  |  |  |
| 54 | 46 | [Download](54/dataset.zip) |  |  |  |  |  |  |  |  |
| 55 | 8 | [Download](55/dataset.zip) |  |  |  |  |  |  |  |  |
| 56 | 22 | [Download](56/dataset.zip) |  |  |  |  |  |  |  |  |
| 57 | 13 | [Download](57/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 205 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
Anthropic/persuasion | ---
license: cc-by-nc-sa-4.0
language:
- en
size_categories:
- 1K<n<10K
---
# Dataset Card for Persuasion Dataset
## Dataset Summary
The Persuasion Dataset contains claims and corresponding human-written and model-generated arguments, along with persuasiveness scores.
This dataset was created for research on measuring the persuasiveness of language models, as described in this blog post: [Measuring the Persuasiveness of Language Models](https://www.anthropic.com/news/measuring-model-persuasiveness).
## Dataset Description
The dataset consists of a CSV file with the following columns:
- **worker\_id**: ID of the participant who recorded their initial and final stance on the claim.
- **claim**: The claim for which the argument was generated.
- **argument**: The generated argument, either by a human or a language model.
- **source**: The source of the argument (model name or "Human").
- **prompt\_type**: The prompt type used to generate the argument.
- **rating\_initial**: The participant's initial rating of the claim.
- **rating\_final**: The participant's final rating of the claim after reading the argument.
## Usage
```python
from datasets import load_dataset
# Loading the data
dataset = load_dataset("Anthropic/persuasion")
```
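To measure how much an argument moved a participant, compare the final rating to the initial one. The sketch below assumes the rating columns are Likert-style strings whose leading token is an integer score (e.g. `"3 - ..."`); adjust the parser if the columns are already numeric.

```python
def rating_to_int(rating) -> int:
    """Parse a rating to an integer score.

    Assumes Likert-style strings such as "3 - Somewhat agree"; values that
    are already numeric are passed through unchanged.
    """
    if isinstance(rating, (int, float)):
        return int(rating)
    return int(str(rating).split()[0])


def persuasion_shift(rating_initial, rating_final) -> int:
    """Positive values mean the argument moved the participant toward the claim."""
    return rating_to_int(rating_final) - rating_to_int(rating_initial)


def mean_shift_by_source(rows):
    """Aggregate the mean rating shift per argument source over dataset rows."""
    totals = {}
    for row in rows:
        src = row["source"]
        shift = persuasion_shift(row["rating_initial"], row["rating_final"])
        n, s = totals.get(src, (0, 0))
        totals[src] = (n + 1, s + shift)
    return {src: s / n for src, (n, s) in totals.items()}
```

Applied to the split loaded above, `mean_shift_by_source` yields one mean rating shift per argument source (model name or "Human").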
## Contact
For questions, you can email esin at anthropic dot com
## Citation
If you would like to cite our work or data, you may use the following bibtex citation:
```
@online{durmus2024persuasion,
author = {Esin Durmus and Liane Lovitt and Alex Tamkin and Stuart Ritchie and Jack Clark and Deep Ganguli},
title = {Measuring the Persuasiveness of Language Models},
date = {2024-04-09},
year = {2024},
url = {https://www.anthropic.com/news/measuring-model-persuasiveness},
}
```
|
open-llm-leaderboard/details_llmixer__BigWeave-v16-103b | ---
pretty_name: Evaluation run of llmixer/BigWeave-v16-103b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [llmixer/BigWeave-v16-103b](https://huggingface.co/llmixer/BigWeave-v16-103b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_llmixer__BigWeave-v16-103b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-10T07:02:03.874032](https://huggingface.co/datasets/open-llm-leaderboard/details_llmixer__BigWeave-v16-103b/blob/main/results_2024-02-10T07-02-03.874032.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7291217373860504,\n\
\ \"acc_stderr\": 0.029814128118071586,\n \"acc_norm\": 0.7334267277522604,\n\
\ \"acc_norm_stderr\": 0.030381307938227346,\n \"mc1\": 0.4785801713586291,\n\
\ \"mc1_stderr\": 0.017487432144711806,\n \"mc2\": 0.6380949314219707,\n\
\ \"mc2_stderr\": 0.015121732490251848\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6237201365187713,\n \"acc_stderr\": 0.014157022555407156,\n\
\ \"acc_norm\": 0.658703071672355,\n \"acc_norm_stderr\": 0.01385583128749773\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6992630950009958,\n\
\ \"acc_stderr\": 0.0045764127139515,\n \"acc_norm\": 0.8761202947619996,\n\
\ \"acc_norm_stderr\": 0.003287709741128796\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8552631578947368,\n \"acc_stderr\": 0.028631951845930405,\n\
\ \"acc_norm\": 0.8552631578947368,\n \"acc_norm_stderr\": 0.028631951845930405\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.72,\n\
\ \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7584905660377359,\n \"acc_stderr\": 0.026341480371118352,\n\
\ \"acc_norm\": 0.7584905660377359,\n \"acc_norm_stderr\": 0.026341480371118352\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8819444444444444,\n\
\ \"acc_stderr\": 0.026983346503309358,\n \"acc_norm\": 0.8819444444444444,\n\
\ \"acc_norm_stderr\": 0.026983346503309358\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7283236994219653,\n\
\ \"acc_stderr\": 0.03391750322321657,\n \"acc_norm\": 0.7283236994219653,\n\
\ \"acc_norm_stderr\": 0.03391750322321657\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.49019607843137253,\n \"acc_stderr\": 0.04974229460422817,\n\
\ \"acc_norm\": 0.49019607843137253,\n \"acc_norm_stderr\": 0.04974229460422817\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7276595744680852,\n \"acc_stderr\": 0.0291012906983867,\n\
\ \"acc_norm\": 0.7276595744680852,\n \"acc_norm_stderr\": 0.0291012906983867\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6228070175438597,\n\
\ \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.6228070175438597,\n\
\ \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7034482758620689,\n \"acc_stderr\": 0.03806142687309993,\n\
\ \"acc_norm\": 0.7034482758620689,\n \"acc_norm_stderr\": 0.03806142687309993\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5608465608465608,\n \"acc_stderr\": 0.025559920550531013,\n \"\
acc_norm\": 0.5608465608465608,\n \"acc_norm_stderr\": 0.025559920550531013\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5158730158730159,\n\
\ \"acc_stderr\": 0.044698818540726076,\n \"acc_norm\": 0.5158730158730159,\n\
\ \"acc_norm_stderr\": 0.044698818540726076\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8225806451612904,\n\
\ \"acc_stderr\": 0.02173254068932928,\n \"acc_norm\": 0.8225806451612904,\n\
\ \"acc_norm_stderr\": 0.02173254068932928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6157635467980296,\n \"acc_stderr\": 0.034223985656575515,\n\
\ \"acc_norm\": 0.6157635467980296,\n \"acc_norm_stderr\": 0.034223985656575515\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\"\
: 0.77,\n \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.027045948825865383,\n\
\ \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.027045948825865383\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9090909090909091,\n \"acc_stderr\": 0.020482086775424208,\n \"\
acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.020482086775424208\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9222797927461139,\n \"acc_stderr\": 0.01932180555722317,\n\
\ \"acc_norm\": 0.9222797927461139,\n \"acc_norm_stderr\": 0.01932180555722317\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7615384615384615,\n \"acc_stderr\": 0.02160629449464773,\n \
\ \"acc_norm\": 0.7615384615384615,\n \"acc_norm_stderr\": 0.02160629449464773\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.4222222222222222,\n \"acc_stderr\": 0.03011444201966809,\n \
\ \"acc_norm\": 0.4222222222222222,\n \"acc_norm_stderr\": 0.03011444201966809\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8403361344537815,\n \"acc_stderr\": 0.0237933539975288,\n \
\ \"acc_norm\": 0.8403361344537815,\n \"acc_norm_stderr\": 0.0237933539975288\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4900662251655629,\n \"acc_stderr\": 0.04081677107248436,\n \"\
acc_norm\": 0.4900662251655629,\n \"acc_norm_stderr\": 0.04081677107248436\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9045871559633027,\n \"acc_stderr\": 0.012595899282335805,\n \"\
acc_norm\": 0.9045871559633027,\n \"acc_norm_stderr\": 0.012595899282335805\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6620370370370371,\n \"acc_stderr\": 0.03225941352631295,\n \"\
acc_norm\": 0.6620370370370371,\n \"acc_norm_stderr\": 0.03225941352631295\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9068627450980392,\n \"acc_stderr\": 0.020397853969426987,\n \"\
acc_norm\": 0.9068627450980392,\n \"acc_norm_stderr\": 0.020397853969426987\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9282700421940928,\n \"acc_stderr\": 0.01679698961111959,\n \
\ \"acc_norm\": 0.9282700421940928,\n \"acc_norm_stderr\": 0.01679698961111959\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7668161434977578,\n\
\ \"acc_stderr\": 0.028380391147094702,\n \"acc_norm\": 0.7668161434977578,\n\
\ \"acc_norm_stderr\": 0.028380391147094702\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.034465133507525995,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.034465133507525995\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035196,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035196\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8611111111111112,\n\
\ \"acc_stderr\": 0.03343270062869623,\n \"acc_norm\": 0.8611111111111112,\n\
\ \"acc_norm_stderr\": 0.03343270062869623\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.852760736196319,\n \"acc_stderr\": 0.027839915278339653,\n\
\ \"acc_norm\": 0.852760736196319,\n \"acc_norm_stderr\": 0.027839915278339653\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6875,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.6875,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822582,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822582\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.0202371490089909,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.0202371490089909\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8569604086845466,\n\
\ \"acc_stderr\": 0.012520023176796501,\n \"acc_norm\": 0.8569604086845466,\n\
\ \"acc_norm_stderr\": 0.012520023176796501\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8208092485549133,\n \"acc_stderr\": 0.020647590029679332,\n\
\ \"acc_norm\": 0.8208092485549133,\n \"acc_norm_stderr\": 0.020647590029679332\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5687150837988827,\n\
\ \"acc_stderr\": 0.01656382939904771,\n \"acc_norm\": 0.5687150837988827,\n\
\ \"acc_norm_stderr\": 0.01656382939904771\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.022292858284568066,\n\
\ \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.022292858284568066\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8038585209003215,\n\
\ \"acc_stderr\": 0.022552447780478026,\n \"acc_norm\": 0.8038585209003215,\n\
\ \"acc_norm_stderr\": 0.022552447780478026\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.022021366100220194,\n\
\ \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.022021366100220194\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.574468085106383,\n \"acc_stderr\": 0.02949482760014436,\n \
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.02949482760014436\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5691003911342895,\n\
\ \"acc_stderr\": 0.012647695889547214,\n \"acc_norm\": 0.5691003911342895,\n\
\ \"acc_norm_stderr\": 0.012647695889547214\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7757352941176471,\n \"acc_stderr\": 0.025336848563332372,\n\
\ \"acc_norm\": 0.7757352941176471,\n \"acc_norm_stderr\": 0.025336848563332372\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7875816993464052,\n \"acc_stderr\": 0.016547148636203147,\n \
\ \"acc_norm\": 0.7875816993464052,\n \"acc_norm_stderr\": 0.016547148636203147\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n\
\ \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.7272727272727273,\n\
\ \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.02560737598657916,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.02560737598657916\n },\n\
\ \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n\
\ \"acc_stderr\": 0.023729830881018526,\n \"acc_norm\": 0.8706467661691543,\n\
\ \"acc_norm_stderr\": 0.023729830881018526\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.031446603773522014,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.031446603773522014\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.025679342723276915,\n\
\ \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.025679342723276915\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4785801713586291,\n\
\ \"mc1_stderr\": 0.017487432144711806,\n \"mc2\": 0.6380949314219707,\n\
\ \"mc2_stderr\": 0.015121732490251848\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8042620363062352,\n \"acc_stderr\": 0.01115114504221832\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6118271417740713,\n \
\ \"acc_stderr\": 0.013423607564002757\n }\n}\n```"
repo_url: https://huggingface.co/llmixer/BigWeave-v16-103b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|arc:challenge|25_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|gsm8k|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hellaswag|10_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T07-02-03.874032.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T07-02-03.874032.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- '**/details_harness|winogrande|5_2024-02-10T07-02-03.874032.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-10T07-02-03.874032.parquet'
- config_name: results
data_files:
- split: 2024_02_10T07_02_03.874032
path:
- results_2024-02-10T07-02-03.874032.parquet
- split: latest
path:
- results_2024-02-10T07-02-03.874032.parquet
---
# Dataset Card for Evaluation run of llmixer/BigWeave-v16-103b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [llmixer/BigWeave-v16-103b](https://huggingface.co/llmixer/BigWeave-v16-103b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_llmixer__BigWeave-v16-103b",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-10T07:02:03.874032](https://huggingface.co/datasets/open-llm-leaderboard/details_llmixer__BigWeave-v16-103b/blob/main/results_2024-02-10T07-02-03.874032.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7291217373860504,
"acc_stderr": 0.029814128118071586,
"acc_norm": 0.7334267277522604,
"acc_norm_stderr": 0.030381307938227346,
"mc1": 0.4785801713586291,
"mc1_stderr": 0.017487432144711806,
"mc2": 0.6380949314219707,
"mc2_stderr": 0.015121732490251848
},
"harness|arc:challenge|25": {
"acc": 0.6237201365187713,
"acc_stderr": 0.014157022555407156,
"acc_norm": 0.658703071672355,
"acc_norm_stderr": 0.01385583128749773
},
"harness|hellaswag|10": {
"acc": 0.6992630950009958,
"acc_stderr": 0.0045764127139515,
"acc_norm": 0.8761202947619996,
"acc_norm_stderr": 0.003287709741128796
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8552631578947368,
"acc_stderr": 0.028631951845930405,
"acc_norm": 0.8552631578947368,
"acc_norm_stderr": 0.028631951845930405
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7584905660377359,
"acc_stderr": 0.026341480371118352,
"acc_norm": 0.7584905660377359,
"acc_norm_stderr": 0.026341480371118352
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8819444444444444,
"acc_stderr": 0.026983346503309358,
"acc_norm": 0.8819444444444444,
"acc_norm_stderr": 0.026983346503309358
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.03391750322321657,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.03391750322321657
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.49019607843137253,
"acc_stderr": 0.04974229460422817,
"acc_norm": 0.49019607843137253,
"acc_norm_stderr": 0.04974229460422817
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7276595744680852,
"acc_stderr": 0.0291012906983867,
"acc_norm": 0.7276595744680852,
"acc_norm_stderr": 0.0291012906983867
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6228070175438597,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.6228070175438597,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7034482758620689,
"acc_stderr": 0.03806142687309993,
"acc_norm": 0.7034482758620689,
"acc_norm_stderr": 0.03806142687309993
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5608465608465608,
"acc_stderr": 0.025559920550531013,
"acc_norm": 0.5608465608465608,
"acc_norm_stderr": 0.025559920550531013
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5158730158730159,
"acc_stderr": 0.044698818540726076,
"acc_norm": 0.5158730158730159,
"acc_norm_stderr": 0.044698818540726076
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8225806451612904,
"acc_stderr": 0.02173254068932928,
"acc_norm": 0.8225806451612904,
"acc_norm_stderr": 0.02173254068932928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6157635467980296,
"acc_stderr": 0.034223985656575515,
"acc_norm": 0.6157635467980296,
"acc_norm_stderr": 0.034223985656575515
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8606060606060606,
"acc_stderr": 0.027045948825865383,
"acc_norm": 0.8606060606060606,
"acc_norm_stderr": 0.027045948825865383
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.020482086775424208,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.020482086775424208
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9222797927461139,
"acc_stderr": 0.01932180555722317,
"acc_norm": 0.9222797927461139,
"acc_norm_stderr": 0.01932180555722317
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7615384615384615,
"acc_stderr": 0.02160629449464773,
"acc_norm": 0.7615384615384615,
"acc_norm_stderr": 0.02160629449464773
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.03011444201966809,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.03011444201966809
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8403361344537815,
"acc_stderr": 0.0237933539975288,
"acc_norm": 0.8403361344537815,
"acc_norm_stderr": 0.0237933539975288
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4900662251655629,
"acc_stderr": 0.04081677107248436,
"acc_norm": 0.4900662251655629,
"acc_norm_stderr": 0.04081677107248436
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9045871559633027,
"acc_stderr": 0.012595899282335805,
"acc_norm": 0.9045871559633027,
"acc_norm_stderr": 0.012595899282335805
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6620370370370371,
"acc_stderr": 0.03225941352631295,
"acc_norm": 0.6620370370370371,
"acc_norm_stderr": 0.03225941352631295
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9068627450980392,
"acc_stderr": 0.020397853969426987,
"acc_norm": 0.9068627450980392,
"acc_norm_stderr": 0.020397853969426987
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9282700421940928,
"acc_stderr": 0.01679698961111959,
"acc_norm": 0.9282700421940928,
"acc_norm_stderr": 0.01679698961111959
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7668161434977578,
"acc_stderr": 0.028380391147094702,
"acc_norm": 0.7668161434977578,
"acc_norm_stderr": 0.028380391147094702
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.034465133507525995,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.034465133507525995
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035196,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035196
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.03343270062869623,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.03343270062869623
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.852760736196319,
"acc_stderr": 0.027839915278339653,
"acc_norm": 0.852760736196319,
"acc_norm_stderr": 0.027839915278339653
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6875,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822582,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822582
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.0202371490089909,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.0202371490089909
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8569604086845466,
"acc_stderr": 0.012520023176796501,
"acc_norm": 0.8569604086845466,
"acc_norm_stderr": 0.012520023176796501
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8208092485549133,
"acc_stderr": 0.020647590029679332,
"acc_norm": 0.8208092485549133,
"acc_norm_stderr": 0.020647590029679332
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5687150837988827,
"acc_stderr": 0.01656382939904771,
"acc_norm": 0.5687150837988827,
"acc_norm_stderr": 0.01656382939904771
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.022292858284568066,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.022292858284568066
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8038585209003215,
"acc_stderr": 0.022552447780478026,
"acc_norm": 0.8038585209003215,
"acc_norm_stderr": 0.022552447780478026
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.022021366100220194,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.022021366100220194
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.02949482760014436,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.02949482760014436
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5691003911342895,
"acc_stderr": 0.012647695889547214,
"acc_norm": 0.5691003911342895,
"acc_norm_stderr": 0.012647695889547214
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7757352941176471,
"acc_stderr": 0.025336848563332372,
"acc_norm": 0.7757352941176471,
"acc_norm_stderr": 0.025336848563332372
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7875816993464052,
"acc_stderr": 0.016547148636203147,
"acc_norm": 0.7875816993464052,
"acc_norm_stderr": 0.016547148636203147
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940588,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940588
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8,
"acc_stderr": 0.02560737598657916,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02560737598657916
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.023729830881018526,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.023729830881018526
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.031446603773522014,
"acc_norm": 0.89,
"acc_norm_stderr": 0.031446603773522014
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.025679342723276915,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.025679342723276915
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4785801713586291,
"mc1_stderr": 0.017487432144711806,
"mc2": 0.6380949314219707,
"mc2_stderr": 0.015121732490251848
},
"harness|winogrande|5": {
"acc": 0.8042620363062352,
"acc_stderr": 0.01115114504221832
},
"harness|gsm8k|5": {
"acc": 0.6118271417740713,
"acc_stderr": 0.013423607564002757
}
}
```
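
As an illustrative sketch (not part of the card's tooling), the per-subtask "hendrycksTest" (MMLU) accuracies in a results dict shaped like the JSON above can be aggregated into a single unweighted mean. The two entries below are copied from the results; the real leaderboard aggregation covers all 57 MMLU subtasks:

```python
# Hedged sketch: average MMLU subtask accuracies from a results dict.
# Only two subtasks are included here for brevity.
results = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6518518518518519},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.8552631578947368},
}

mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
mean_acc = sum(mmlu_accs) / len(mmlu_accs)  # unweighted mean over subtasks
```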
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
zjysteven/WikiMIA_concat | ---
dataset_info:
features:
- name: input
dtype: string
- name: label
sequence: int64
splits:
- name: WikiMIA_concat
num_bytes: 379618
num_examples: 387
download_size: 230624
dataset_size: 379618
configs:
- config_name: default
data_files:
- split: WikiMIA_concat
path: data/WikiMIA_concat-*
---
|
Janiele/pauloflores | ---
license: openrail
---
|
qgyd2021/tweets | ---
license: apache-2.0
---
## Tweets
Archive Team: The Twitter Stream Grab
https://archive.org/details/twitterstream
### Tweets With Emoji Dataset
Data source:
```text
https://www.kaggle.com/datasets/ericwang1011/tweets-with-emoji
```
Tweets containing emoji.
Use case: automatically adding emoji to text.
Examples:
| Sample Count | Category | Example 1 | Example 2 |
| --- | --- | --- | --- |
| 20025 | backhand_index_pointing_right | 🧡@KeplerHomes AirdropBox event for #Arbitrum ecological users is here. A total of 550,000 addresses are eligible for #airdrop, and 5 types of AirDropbox with different scarcity can be issued.<br><br>💙Invitation code: 52DC39<br>🏆Airdrop Portal:👉 https://t.co/fudohu97uV | Remember, success in online business is a marathon, not a sprint. Keep at it, stay focused, and success will come." #patience #onlinebusiness #success<br>For more tips and Strategies, follow me 👉 @coach_lawrence1 https://t.co/IvtL9Om86J |
| 20000 | check_mark | Winner 🏆: @chinzhillaTG<br><br>Verify your win on @YOSHIYOCHIYUH, read his pinned tweet and send the needed details to his DM.<br><br>✔️ https://t.co/2Rugq2zCrg | @Rwpcity Double road. ✔️ Daily at 2:00am. |
| 20000 | check_mark_button | 💰 Denet Giveaway !💰 BIG Chalage <br><br>🏆 Reward:$8239.25 worth of $FB Tokens<br><br>✅ Follow <br>✅ Like & RT<br>✅ Complete DeNet tasks ⤵️<br>https://t.co/hHTFz4UKtO <br><br>🔔Tip: the more invites, the more you earn !<br><br>#Play2Earn #DOGE #Cryptos $ARB $USDT #eth #Giveaway #Airdrop #DeNet https://t.co/37jdSrXuyo | Countertop Ice Maker 🧊🧊<br><br>✅$82.99<br><br>❤️Clip $50Coupon<br><br>❤️5% off CODE: 1JTFQO0F <br><br>🔗https://t.co/HoI8cw9YgJ |
| 20000 | clown_face | Didn't that #CovidUK killer🤡 #Johnson do enough damage? Will folks be so gullible as to elect another devious establishment chancer @Keir_Starmer. if so #Nothing_will_Change | He is right.. there was/is no chaos 🤡 |
| 20000 | cooking | When you don’t like the person, it’s always like “what’s doing this one?”🫠🫠🫠🫠 | Still cooking 👨🍳 #uidesign #figma #UIUX https://t.co/ZvDpSO3LV1 |
| 20001 | egg | Happy Easter😀🐇🥚 | @elonmusk @teslaownersSV #eggs Easter eggs today will surely be delicious. #Eggs #Eggs #Eggs.🥚🥚🥚🍳🍳🍳 |
| 20000 | enraged_face | @Sandhillsrider @LoveAmerica615 @nygrlahart @Chriscarroll50 @1angryhillbilly @ZacharyIvanPor1 @PubliusNV @MaryfromMarin @GoldBaron08 @HPolisports @34FryingpanA22 @AugustusMcRae1 I would have gone after them fu*kers.😡😡😡 | @ChudsOfTikTok Award wages: DEFINITION 1. the smallest amount of money that an employer is legally allowed to pay for a particular type of work. <br><br>Of course. 🙄😡 |
| 20000 | eyes | @big_emtee Pshhh. There's no differences between races<br>👀 https://t.co/oDryND12Ar | For my TL<br>👀 |
| 20000 | face_holding_back_tears | @Fairy_Trades This mentality though 😂😤 But can you borrow me $1k😩🥹 | @crisgrdi i'll be looking forward to more of your tweets for this promo! it's really so soft 🥹 |
| 20000 | face_savoring_food | @PastorAlexLove Thank you, pastor. My mouth should get more....<br>Lol<br>I mean more cake...<br>Or do I? 🤤🤭😋🤣 | So horny right now, sending pics of my thick hard cock to every girl that dms 😋 <br>#horny #hard #cum #dick #cock #bwc #dmme #cocktribute #cumtribute #wankchat #wanktribute #nsfw #nsfwtwt #dickrate #tributeyou #sub #gavat #pasif |
| 20000 | face_with_steam_from_nose | @TheCaseySmith @DerrickEvans_WV Haha yep, get Trump😤<br>All of the problems in this country will be gone. Right? Poof! | So all the “ I’m from Cleveland “ baseball jackets sold out 😤 |
| 20000 | face_with_tears_of_joy | @mattyV_BOSS Something we can finally agree on 😂 | @namasoprop TBF, he was tryna calm him down before the yellow too😂 |
| 20000 | fearful_face | @sunnyhoney_kay bruh i’m from california and I didn’t know that 😨😨 | @spideramys i think this might’ve been a harry potter fanfiction following lily 💀<br><br>also how does hermione have such a smart meaning, then the other two protagonists are “harry” “ronald”<br><br>and of course “cho chang” exists 😨 |
| 20000 | fire | One of the things I hate most is the lie the fact of being deceived 😔 https://t.co/P9EHHwqO61 | I’ve been able to put my daughters hair in 2 full pig tails since she was 3 months old 😭 that’s insane |
| 20000 | folded_hands | If you know me you'll know I have wobbles, I hide, I don't speak and I fear the moment. I act odd but I bounce back . Sometimes I need to write it down to remind myself this is a blip not the final chapter there's more to this book of life for me. 🙏❤️ | @RepMattGaetz I personally don’t like you at all. Honestly. But if your a real Christian (even though I have my doubts considering who you like ). I will as a Christian wish you also a blessed and Happy Easter to you and your family. I’m not spreading hate on Easter Sunday. God Bless🙏 |
| 20000 | ghost | @DakotaLaden @ChelseaLaden @Tanner_Wiseman @Alex_Schroeder4 @ConnorStallings The support for the #ProjectFear kickstarter was so amazing and inspiring!! Thanks for reminding the #FearFam to never give up and good things happen to good people!! 👻😈 https://t.co/3bWpwsDY3C | @chillpillFTM @8play_games So SIK! This is the perfect way to $CHILL 💊🕹️🔥<br>$FTM #FTM https://t.co/SJcG4DRGCP |
| 20000 | grinning_face_with_sweat | @mikegapinski @TeslaAndroid Yeah, maybe that’s it. 😅 | Dj, but sidenote how's ty williams made the shortlist 😅 few excellent games but surely nowhere near our pots |
| 20000 | hatching_chick | Happy Easter 🐣 🐰 Everyone <br>Bath & Body Works 💙 Sales<br>40% OFF EVERYTHING <br>ALL MISTS & BODY CREAM $5.50<br>Not included in 40% off promotion.<br>JUST ADDED! ENDS AT 6PM ONLINE!<br>FREE SHIPPING ON $50<br>USED CODE - EASTERGIFT https://t.co/G2AyrbcGmV | Happy Easter friends! 🐣🐰💛 |
| 20000 | hot_face | @rambojr90 Then what’s this you posted? 🍯 😂 | That jawline can cut through anything….he’s so hot 🥵 ❤️🔥 |
| 20000 | loudly_crying_face | i thought i was gonna write notes this holy week tas i’m js watching true beauty 😭😭😭 | BUT WHAT ABOUT I HOPE I NEVER LOSE YOU HOPE THIS NEVER ENDSSSSSSS 😭😭😭😭😭 |
| 20000 | melting_face | This Han Jinsung with this Seo Changbin ‼️<br>😳🥴😵💫😵🫠 <br><br>credit to @/cheesechoux_cb for Changbins vid and to whomever took CB's pic https://t.co/AjyfpZ8ZQ0 | @anyatrades i dont like it when life brings me lemons, But @anyatrades can bring me lemons any day of the week 🫠 |
| 20000 | middle_finger | When He is busy at work I like to send him pics and videos to brighten up his day 😏😈 | @ryuuly 🖕 |
| 20001 | partying_face | The mafia boss Jimin agenda is thriving and I am here for it 😌https://t.co/RUDXh2u4WZ | We're still over the moon about our Leander office ribbon cutting! 🥳😍 A huge thanks to everyone who came out to support and celebrate our new office opening. We're proud to now offer GI care to the Leander community! 💙 @LeanderChamber @Christine_LTX 💙 https://t.co/emylqhQXBK https://t.co/wdHWJdQZe5 |
| 20000 | party_popper | @RedRosesForAme1 Thank you so much! 🎉💙 | 🎉Web3 Protocol X MetaStudio Giveaway<br><br> 🏆Prize Pool:- 5,000,000 $METAS + 200 USDT<br><br>To Enter:- <br>✔️Follow @Web3_Protocol & @MetaStudioLand<br>✔️Like and RT 3 friends<br>✔️ Fill form :- https://t.co/HHNA1MGw66<br><br>#daoforcreators #metaverseforcreators $metas |
| 20000 | pile_of_poo | y'all love saying this "y'AlL dOn'T kNoW hOw tHe GuBmEnT wOrKs" nonsensical 🐂💩 as if i can't call EVERYONE that's involved out. that man knew exactly what he did. | 👁💩👁<br> Sewage spills in your area mapped as Tories accused of ‘throwing in towel’ on leaks https://t.co/nqU7i6tcfg |
| 20000 | rabbit | @TaylorNasse Happy Easter, Ashleigh. ☕️🐇🥚💐❤️ | 😍🥰Wishing you the happiest of Easter🐇🫶🏾🐣🥳 https://t.co/bHeeH28lK6 |
| 20000 | rabbit_face | Check this out @Malissa_Longo When this young man Play's Michael Jackson Smooth Criminal on Broken piano Happy Easter 🐰 enjoy https://t.co/9hXWIRQYsV | @spicylife24 😂😂🤣子供の説明〜💦だいたいやもん🤣🤣🤣 |
| 20001 | red_heart | @chojiVAL true, some human interaction will truly do so many ppl good man. it’s honestly easier for me to connect w ppl thru talking then texting anymore 😭 | Liverpool 2-2 A.Ramsdale 🔥<br>#LIVARS |
| 20000 | rolling_on_the_floor_laughing | @itsvoltic1 NONE OF THESE MFS KNOW IM TOP 50 ALL STATS NA REALM🤣🤣🤣🤣 | @jsamchill 🤣 that’s exactly what I want!! It’s too many girls |
| 20000 | saluting_face | @HypeEth_ Gm legend 🤘❤️<br>Of course we are 🫡 | Have a Great Easter wkd @ripcache squad 🍳🫡 https://t.co/sKJFMKBdkv |
| 20000 | skull | Y’all gotta not put so much thought in the Super Mario Bros Movie 💀 | I ordered Coke Zero and they gave it to me in what looks like a plastic lassi da glass from the pind 💀 |
| 20000 | smiling_face | @eltoro_bro Im so mad i misssed it but it was so good to just see your name on my screen! ☺️ i cant wait to drive you crazy( i have a lot to make up for lol) | @drg357 Well you just continue getting better and get plenty of rest. I'm right here(on the other side of the universe ☺️) if you need a chat💞 https://t.co/AihkF5mJ9d |
| 20000 | smiling_face_with_halo | Goodnight everyone <3 remember to continue to stream 😇💜 https://t.co/nYJRkutITD | @KariLakeWarRoom @JenAFifield Congratulations mame Kari Lake you are also the beautiful face of all Americans. <br>💐💐💐<br>😇♥️🕊️🤗😘🌙🥰👌👍🙏🙏🙏 |
| 20000 | smiling_face_with_heart-eyes | The Daughter & Son 👫🐇😍 https://t.co/X6rnSP5zyD | @GoodPieceOfSass That's so cool!! 😍 If you ever track it down, please share. I'd get a kick out of that forsure. |
| 20000 | smiling_face_with_hearts | @BossfanAndrew 😂 oooo the year I intend to send only the one song due to my level of adoration for it 🥰 #forgotten80s | @honeyluved Stooop!! You're making me cry even more! 😭😭 And I kinda miss your voice now btw! 🥹🫰🫶 |
| 20000 | smiling_face_with_sunglasses | I love when that ass be so soft like Charmin boy 😎🤗🤯 | Rather be surrounded by dat water than some fuck niggas that's why I love dat beach 🏖️😎 |
| 20000 | smiling_face_with_tear | i was wondering why i hadnt seen any dunes vids yet then i remembered it's 3h behind 🥲 | @alive_without_u 🥲👍 |
| 20000 | sparkles | @CreoleBbyBritt This is why you’re my favorite! ✨ | Added this little one to my throne 🥺✨ |
| 20000 | sun | @Erdayastronaut I hope that in reality they will do the boostback burn the other way round! 😳 | Exotic smoke 🌿with the sun roof down type of day ☀️ |
| 20000 | thinking_face | @Ko_Sa_Ra_Chi_ Then it's a whale or porpoises and sharks CAN be lovely too 🤔 | @fkeyamo So your job is to sort 🤔 unemployment, but somewhat bothered more about Obi.<br>🤮 |
| 20000 | thumbs_up | @owenclark3 Fluffing hell, how peculiar 👍🙀 | @theSuiPunks @tocen__ 👍👍👍 |
| 20000 | white_heart | @aespa_official HAPPY B'DAY PRETTYYYYY 🤍<br><br>BLOOMING KARINA DAY<br>#지민아_마이의_푸른봄은_너야<br>#Welcome_to_MyKarina | Finally. It's wrap. Well done Enigma babies @primiilly1 @winmetawin and the whole team. Becareful on the way back home. Have a rest naa 🤍 |
| 20000 | winking_face | @Holy_Trinity_AV The Prince and Wales and Prince George weren't the only Villa royalty at the game I see Pete 😉Glad you finally got over. Don't leave it so long next time. Don Unai is creating something special here | @Answer4today @BeutelDory @krassenstein I’m middle class and my taxes went down, but thank you trailer trash who wants to voice an opinion for me. 😉 |
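
As an illustrative sketch (not part of the dataset tooling), one way to turn such tweets into (plain text, emoji) training pairs is to strip the emoji out and keep them as the prediction target. The regex below covers only an approximate set of Unicode emoji ranges, not the full emoji specification:

```python
import re

# Approximate emoji character class: regional indicators (flags),
# symbols and pictographs, misc symbols/dingbats, variation selector.
EMOJI_RE = re.compile(
    "["
    "\U0001F1E6-\U0001F1FF"
    "\U0001F300-\U0001FAFF"
    "\u2600-\u27BF"
    "\uFE0F"
    "]"
)

def make_pair(tweet: str):
    """Split a tweet into (emoji-free text, list of emoji) so a model
    can be trained to add emoji back to plain text."""
    emojis = EMOJI_RE.findall(tweet)
    text = re.sub(r"\s+", " ", EMOJI_RE.sub("", tweet)).strip()
    return text, emojis

text, emojis = make_pair("Happy Easter 🐣 🐰 Everyone")
```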
Loading the data:
```python
#!/usr/bin/python3
# -*- coding: utf-8 -*-
from datasets import load_dataset

# Stream the "tweets_with_emoji" subset so the full dataset
# does not need to be downloaded up front.
dataset = load_dataset(
    "qgyd2021/tweets",
    name="tweets_with_emoji",
    split="train",
    trust_remote_code=True,
    streaming=True,
)

for sample in dataset:
    print(sample)
```
unpredictable/unpredictable_unique | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: UnpredicTable-unique
size_categories:
- 100K<n<1M
source_datasets: []
task_categories:
- multiple-choice
- question-answering
- zero-shot-classification
- text2text-generation
- table-question-answering
- text-generation
- text-classification
- tabular-classification
task_ids:
- multiple-choice-qa
- extractive-qa
- open-domain-qa
- closed-domain-qa
- closed-book-qa
- open-book-qa
- language-modeling
- multi-class-classification
- natural-language-inference
- topic-classification
- multi-label-classification
- tabular-multi-class-classification
- tabular-multi-label-classification
---
# Dataset Card for "UnpredicTable-unique" - Dataset of Few-shot Tasks from Tables
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Repository:** https://github.com/AnonCodeShare/few-shot-adaptation
- **Paper:** Few-shot Adaptation Works with UnpredicTable Data
### Dataset Summary
The UnpredicTable dataset consists of web tables formatted as few-shot tasks for fine-tuning language models to improve their few-shot performance.
There are several dataset versions available:
* [UnpredicTable-full](https://huggingface.co/datasets/unpredictable/unpredictable_full): Starting from the initial WTC corpus of 50M tables, we apply our tables-to-tasks procedure to produce our resulting dataset, [UnpredicTable-full](https://huggingface.co/datasets/unpredictable/unpredictable_full), which comprises 413,299 tasks from 23,744 unique websites.
* [UnpredicTable-unique](https://huggingface.co/datasets/unpredictable/unpredictable_unique): This is the same as [UnpredicTable-full](https://huggingface.co/datasets/unpredictable/unpredictable_full) but filtered to have a maximum of one task per website. [UnpredicTable-unique](https://huggingface.co/datasets/unpredictable/unpredictable_unique) contains exactly 23,744 tasks from 23,744 websites.
* [UnpredicTable-5k](https://huggingface.co/datasets/unpredictable/unpredictable_5k): This dataset contains 5k random tables from the full dataset.
* UnpredicTable data subsets based on the website of origin:
* [UnpredicTable-support-google-com](https://huggingface.co/datasets/unpredictable/unpredictable_support-google-com)
### Supported Tasks and Leaderboards
Since the tables come from the web, the distribution of tasks and topics is very broad. The shape of our dataset is very wide: we have thousands of tasks, each with only a few examples, whereas most current NLP datasets are very deep, with tens of tasks and many examples per task. This implies that our dataset covers a broad range of potential tasks, e.g., multiple-choice, question-answering, table-question-answering, text-classification, etc.
The intended use of this dataset is to improve few-shot performance by fine-tuning or pre-training on it.
### Languages
English
## Dataset Structure
### Data Instances
Each task is represented as a JSON Lines file and consists of several few-shot examples. Each example is a dictionary with a 'task' field that identifies the task, followed by 'input', 'options', and 'output' fields. The 'input' field contains several column elements of the same table row, while the 'output' field is the target, representing a single column of that row. Each task contains several such examples, which can be concatenated as a few-shot task. For multiple-choice classification, the 'options' field lists the possible classes a model must choose from.
There are also additional meta-data fields such as 'pageTitle', 'title', 'outputColName', 'url', 'wdcFile'.
### Data Fields
- `task`: task identifier.
- `input`: column elements of a specific row in the table.
- `options`: for multiple-choice classification, the options to choose from.
- `output`: target column element of the same row as the input.
- `pageTitle`: title of the page containing the table.
- `outputColName`: output column name.
- `url`: URL of the website containing the table.
- `wdcFile`: WDC Web Table Corpus file.
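The 'input', 'options', and 'output' fields can be concatenated into a few-shot prompt. Below is a minimal sketch; the prompt template and the toy examples are illustrative assumptions, not the exact format used in the paper:

```python
def format_few_shot_prompt(examples):
    """Concatenate UnpredicTable examples (dicts with 'input', 'options',
    'output') into a single few-shot prompt string."""
    blocks = []
    for ex in examples:
        lines = [f"Input: {ex['input']}"]
        options = ex.get("options") or []
        if options:  # only present for multiple-choice tasks
            lines.append("Options: " + " / ".join(options))
        lines.append(f"Output: {ex['output']}")
        blocks.append("\n".join(lines))
    return "\n\n".join(blocks)

# Hypothetical examples following the documented schema.
task = [
    {"input": "[Color] red [Fruit]", "options": ["apple", "banana"], "output": "apple"},
    {"input": "[Color] yellow [Fruit]", "options": ["apple", "banana"], "output": "banana"},
]
print(format_few_shot_prompt(task))
```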
### Data Splits
The UnpredicTable datasets do not come with additional data splits.
## Dataset Creation
### Curation Rationale
Few-shot training on multi-task datasets has been demonstrated to improve language models' few-shot learning (FSL) performance on new tasks, but it is unclear which training tasks lead to effective downstream task adaptation. Few-shot learning datasets are typically produced with expensive human curation, limiting the scale and diversity of the training tasks available to study. As an alternative source of few-shot data, we automatically extract 413,299 tasks from diverse internet tables. We provide this as a research resource to investigate the relationship between training data and few-shot learning.
### Source Data
#### Initial Data Collection and Normalization
We use internet tables from the English-language Relational Subset of the WDC Web Table Corpus 2015 (WTC). The WTC dataset tables were extracted from the July 2015 Common Crawl web corpus (http://webdatacommons.org/webtables/2015/EnglishStatistics.html). The dataset contains 50,820,165 tables from 323,160 web domains. We then convert the tables into few-shot learning tasks. Please see our publication for more details on the data collection and conversion pipeline.
#### Who are the source language producers?
The dataset is extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/).
### Personal and Sensitive Information
The data was extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/), which in turn extracted tables from the [Common Crawl](https://commoncrawl.org/). We did not filter the data in any way. Thus any user identities or otherwise sensitive information (e.g., data that reveals racial or ethnic origins, sexual orientations, religious beliefs, political opinions or union memberships, or locations; financial or health data; biometric or genetic data; forms of government identification, such as social security numbers; criminal history, etc.) might be contained in our dataset.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset is intended for use as a research resource to investigate the relationship between training data and few-shot learning. As such, it contains high- and low-quality data, as well as diverse content that may be untruthful or inappropriate. Without careful investigation, it should not be used for training models that will be deployed for use in decision-critical or user-facing situations.
### Discussion of Biases
Since our dataset contains tables that are scraped from the web, it will also contain many toxic, racist, sexist, and otherwise harmful biases and texts. We have not run any analysis on the biases prevalent in our datasets. Neither have we explicitly filtered the content. This implies that a model trained on our dataset may potentially reflect harmful biases and toxic text that exist in our dataset.
### Other Known Limitations
No additional known limitations.
## Additional Information
### Licensing Information
Apache 2.0 |
DiogoAvalos/claudioduarte | ---
license: openrail
---
|
Sunbird/Experimental-Speech-Salt-Ateso-16k | ---
dataset_info:
features:
- name: audio
sequence:
sequence: float32
- name: sample_rate
dtype: int64
- name: transcription
dtype: string
- name: speaker_id
dtype: string
splits:
- name: train
num_bytes: 1726534428
num_examples: 4211
- name: validation
num_bytes: 96950913
num_examples: 231
- name: test
num_bytes: 105595730
num_examples: 250
download_size: 931410865
dataset_size: 1929081071
---
# Dataset Card for "Experimental-Speech-Salt-Ateso-16k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rinflan/sovits4.0 | ---
license: cc-by-nc-4.0
---
|
felipesampaio/darwin | ---
license: openrail
---
|
bosbos/falcon_large_data | ---
license: apache-2.0
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 16312071
num_examples: 10846
download_size: 9447072
dataset_size: 16312071
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
vntc/wiki-full-corpus | ---
dataset_info:
features:
- name: metadata
struct:
- name: doc_id
dtype: string
- name: split
dtype: int64
- name: title
dtype: string
- name: token_count
dtype: int64
- name: url
dtype: string
- name: passage
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 1377054502
num_examples: 1639166
download_size: 605204760
dataset_size: 1377054502
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jlbaker361/wikiart20 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: style
dtype: string
- name: name
dtype: string
- name: gen_style
dtype: string
splits:
- name: train
num_bytes: 1166666.142857143
num_examples: 18
- name: test
num_bytes: 83966.85714285714
num_examples: 3
download_size: 1255245
dataset_size: 1250633.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
furry-br/lute | ---
license: openrail
---
|
Codec-SUPERB/libri2Mix_test_unit | ---
configs:
- config_name: default
data_files:
- split: academicodec_hifi_16k_320d
path: data/academicodec_hifi_16k_320d-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
- split: audiodec_24k_320d
path: data/audiodec_24k_320d-*
- split: dac_16k
path: data/dac_16k-*
- split: dac_24k
path: data/dac_24k-*
- split: dac_44k
path: data/dac_44k-*
- split: encodec_24k_12bps
path: data/encodec_24k_12bps-*
- split: encodec_24k_1_5bps
path: data/encodec_24k_1_5bps-*
- split: encodec_24k_24bps
path: data/encodec_24k_24bps-*
- split: encodec_24k_3bps
path: data/encodec_24k_3bps-*
- split: encodec_24k_6bps
path: data/encodec_24k_6bps-*
- split: funcodec_en_libritts_16k_gr1nq32ds320
path: data/funcodec_en_libritts_16k_gr1nq32ds320-*
- split: funcodec_en_libritts_16k_gr8nq32ds320
path: data/funcodec_en_libritts_16k_gr8nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds320
path: data/funcodec_en_libritts_16k_nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds640
path: data/funcodec_en_libritts_16k_nq32ds640-*
- split: funcodec_zh_en_16k_nq32ds320
path: data/funcodec_zh_en_16k_nq32ds320-*
- split: funcodec_zh_en_16k_nq32ds640
path: data/funcodec_zh_en_16k_nq32ds640-*
- split: speech_tokenizer_16k
path: data/speech_tokenizer_16k-*
dataset_info:
features:
- name: id
dtype: string
- name: unit
sequence:
sequence: int64
splits:
- name: academicodec_hifi_16k_320d
num_bytes: 16215839
num_examples: 2000
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 16215839
num_examples: 2000
- name: academicodec_hifi_24k_320d
num_bytes: 24269183
num_examples: 2000
- name: audiodec_24k_320d
num_bytes: 51773695
num_examples: 2000
- name: dac_16k
num_bytes: 60908095
num_examples: 2000
- name: dac_24k
num_bytes: 243839551
num_examples: 2000
- name: dac_44k
num_bytes: 79082623
num_examples: 2000
- name: encodec_24k_12bps
num_bytes: 97014847
num_examples: 2000
- name: encodec_24k_1_5bps
num_bytes: 12209119
num_examples: 2000
- name: encodec_24k_24bps
num_bytes: 193935679
num_examples: 2000
- name: encodec_24k_3bps
num_bytes: 24324223
num_examples: 2000
- name: encodec_24k_6bps
num_bytes: 48554431
num_examples: 2000
- name: funcodec_en_libritts_16k_gr1nq32ds320
num_bytes: 129580607
num_examples: 2000
- name: funcodec_en_libritts_16k_gr8nq32ds320
num_bytes: 129580607
num_examples: 2000
- name: funcodec_en_libritts_16k_nq32ds320
num_bytes: 129447999
num_examples: 2000
- name: funcodec_en_libritts_16k_nq32ds640
num_bytes: 65020991
num_examples: 2000
- name: funcodec_zh_en_16k_nq32ds320
num_bytes: 129447999
num_examples: 2000
- name: funcodec_zh_en_16k_nq32ds640
num_bytes: 65020991
num_examples: 2000
- name: speech_tokenizer_16k
num_bytes: 32432511
num_examples: 2000
download_size: 234832275
dataset_size: 1548874829
---
# Dataset Card for "libri2Mix_test_unit"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bjoernp/oscar2023_deduped_filtered_1.1 | ---
language:
- de
size_categories:
- 10M<n<100M
---
# Oscar 2023_01 DE Deduplicated
This is a filtered and deduplicated version of the German subset of the [23.01 OSCAR Corpus](https://huggingface.co/datasets/oscar-corpus/OSCAR-2301), a large, crawled, and processed text dataset
curated by the OSCAR project (Open Super-large Crawled Aggregated coRpus).
OSCAR 23.01 is the January 2023 version of the OSCAR Corpus based on the November/December 2022 dump of Common Crawl.
While being quite similar to OSCAR 22.01, it contains several new features, including KenLM-based adult content detection, [...].
It was deduplicated using a MinHash implementation from the `text-dedup` library by `ChenghaoMou`, available on [GitHub](https://github.com/ChenghaoMou/text-dedup), with the following command:
```bash
python -m text_dedup.minhash --path oscar-corpus/OSCAR-2301 --name "de" --cache_dir "../cache" --split "train" --column "text" --batch_size 10000 --output output/minhash_oscar_de_dedup
```
## Deduplication statistics
| Step | Runtime |
|---|---|
| Loading | 10.64s |
| MinHashing | 10574.02s |
| Clustering | 12187.65s |
| Filtering | 4198.70s |
| Saving | 3560.06s |
| Total | 30531.07s |
| Dataset | Number of documents |
|---|---|
| Before | 103299215 |
| After | 53172498 |
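As intuition for what the MinHash step computes, here is a toy, pure-Python sketch of signature-based near-duplicate detection. It is illustrative only: the actual `text-dedup` pipeline uses word-level shingling and banded locality-sensitive hashing to avoid pairwise comparisons, none of which is shown here.

```python
import hashlib

def minhash_signature(text, num_perm=64, shingle_size=3):
    """Toy MinHash: for each of num_perm salted hash functions, keep the
    minimum hash over the document's character shingles."""
    shingles = {text[i:i + shingle_size] for i in range(len(text) - shingle_size + 1)}
    return [
        min(
            int.from_bytes(hashlib.sha1(f"{seed}:{s}".encode()).digest()[:8], "big")
            for s in shingles
        )
        for seed in range(num_perm)
    ]

def jaccard_estimate(sig_a, sig_b):
    """Fraction of matching signature positions estimates Jaccard similarity."""
    return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)

a = minhash_signature("Die Katze sitzt auf der Matte und schläft.")
b = minhash_signature("Die Katze sitzt auf der Matte und schlaeft.")
c = minhash_signature("A completely unrelated English sentence.")
print(jaccard_estimate(a, b), jaccard_estimate(a, c))  # near-duplicates score much higher
```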
## Dataset scheme:
```json
{
"text":"English sentence\nphrase en français\n????????????", // (1)
"meta":{
"warc_headers":{ // (2)
"warc-identified-content-language":"fra,eng",
"warc-target-uri":"https://fr.wikipedia.org/wiki/...",
"warc-record-id":"<urn:uuid:29eaa920-d299-4b1d-b687-c72bd8d68116>",
"warc-type":"conversion",
"content-length":"35298", // (3)
"warc-refers-to":"<urn:uuid:39e42055-0d94-4e45-9c6c-9e7056635d64>",
"warc-block-digest":"sha1:WFH2A5WHCS2H365GIAFYQPI7UOAMFGHB", // (3)
"warc-date":"2022-11-26T09:45:47Z",
"content-type":"text/plain"
},
"identification":{ // (4)
"label":"fr",
"prob":0.8938327
},
"harmful_pp":4063.1814, // (5)
"tlsh":"tlsh:T125315FF2B6088901EEA097015DB39B4600B...", // (6)
"quality_warnings":[ // (7)
"short_sentences",
"header",
"footer"
],
"categories":[ // (8)
"examen_pix",
"liste_bu"
],
"sentence_identifications":[ // (9)
{
"label":"fr",
"prob":0.99837273
},
{
"label":"en",
"prob":0.9992377
},
null
]
}
}
```
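For example, the document-level language identification can be read off a sample following this scheme (a minimal sketch; the field names come from the scheme above, while the 0.9 threshold is illustrative):

```python
def passes_language_check(sample, language="de", min_prob=0.9):
    """Return True if the document-level language identification matches
    `language` with probability at least `min_prob`."""
    ident = sample.get("meta", {}).get("identification") or {}
    return ident.get("label") == language and ident.get("prob", 0.0) >= min_prob

sample = {
    "text": "Ein Beispieldokument.",
    "meta": {"identification": {"label": "de", "prob": 0.97}},
}
print(passes_language_check(sample))  # → True
```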
## Filtering
Filtered with the following code (hyperparameters might vary slightly):
```python
from datasets import load_dataset, load_from_disk
import time
# Categories from https://dsi.ut-capitole.fr/blacklists/index_en.php
blocked_categories = set([
"adult", # Some adult site from erotic to hard pornography
"aggressif", # Sites that are aggressive or violent
"malware", # Any website which delivers malware
"phishing", # Same as above
"cryptojacking", # Mining site by hijacking
"dangerous_material", # Sites which describe how to make bomb and some dangerous material
])
# Blocked quality filters
blocked_quality_warnings = set([
"tiny", # The document has a low (≤ 5) number of lines
    "short_sentences", # The document has a high number (≥ 50%) of short lines
# "header", # Indicates that low-quality content could be present at the start of the document
# "footer", # Indicates that low-quality content could be present at the tail of the document
"noisy", # Indicates that the document is noisy
])
harmful_ppl_threshold = 500 # Determines the threshold for harmful ppl (lower is more harmful) TODO
language_prob_threshold = 0.9 # Determines the threshold for language identification (higher is more likely) TODO
blocked_urls = set([
"de.wikipedia.org", # Wikipedia (because we already have it)
"tagesschau.de", # Tagesschau (because we already have it)
])
def filter_content(example):
has_blocked_category = False
if "categories" in example["meta"] and example["meta"]["categories"] is not None:
has_blocked_category = len(set(example["meta"]["categories"]).intersection(blocked_categories)) > 0
has_blocked_quality_warnings = False
if "quality_warnings" in example["meta"] and example["meta"]["quality_warnings"] is not None:
has_blocked_quality_warnings = len(set(example["meta"]["quality_warnings"]).intersection(blocked_quality_warnings)) > 0
has_blocked_url = False
if "warc_headers" in example["meta"] and "warc-target-uri" in example["meta"]["warc_headers"] and example["meta"]["warc_headers"]["warc-target-uri"] is not None:
has_blocked_url = any([url in example["meta"]["warc_headers"]["warc-target-uri"] for url in blocked_urls])
has_harmful_ppl = example["meta"]["harmful_pp"] < harmful_ppl_threshold if "harmful_pp" in example["meta"] else False
has_bad_german_identification = example["meta"]["identification"]["prob"] < language_prob_threshold if "identification" in example["meta"] else True
return not (has_blocked_category or has_blocked_quality_warnings or has_blocked_url or has_harmful_ppl or has_bad_german_identification)
t_start = time.time()
ds = load_dataset("bjoernp/oscar2023_de_deduped", split="train", num_proc=128)
print(f"Loading took {time.time() - t_start}s")
print(f"Dataset size before filtering: {len(ds)}")
t_start = time.time()
ds = ds.filter(filter_content, num_proc=128)
print(f"Filtering took {time.time() - t_start}s")
print(f"Dataset size after filtering: {len(ds)}")
```
## Licensing
We follow the original licensing scheme of the Oscar Corpus.
(taken from the [OSCAR Corpus](https://huggingface.co/datasets/oscar-corpus/OSCAR-2301); note that we cannot reasonably comply with takedown requests):
```
These data are released under this licensing scheme
We do not own any of the text from which these data has been extracted.
We license the actual packaging, the metadata and the annotations of these data under the Creative Commons CC0 license ("no rights reserved") http://creativecommons.org/publicdomain/zero/1.0/
To the extent possible under law, the OSCAR project, Inria, the Univertity of Mannheim and DFKI GmbH have waived all copyright and related or neighboring rights to OSCAR
This work is published from: France and Germany.
[[[
Should you consider that our data contains material that is owned by you and should therefore not be reproduced here, please:
* Clearly identify yourself, with detailed contact data such as an address, telephone number or email address at which you can be contacted.
* Clearly identify the copyrighted work claimed to be infringed.
* Clearly identify the material that is claimed to be infringing and information reasonably sufficient to allow us to locate the material.
We will comply to legitimate requests by removing the affected sources from the next release of the corpus.
]]]
```
## Citation
```
@ARTICLE{2022arXiv221210440J,
author = {{Jansen}, Tim and {Tong}, Yangling and {Zevallos}, Victoria and {Ortiz Suarez}, Pedro},
title = "{Perplexed by Quality: A Perplexity-based Method for Adult and Harmful Content Detection in Multilingual Heterogeneous Web Data}",
journal = {arXiv e-prints},
keywords = {Computer Science - Computation and Language},
year = 2022,
month = dec,
eid = {arXiv:2212.10440},
pages = {arXiv:2212.10440},
doi = {10.48550/arXiv.2212.10440},
archivePrefix = {arXiv},
eprint = {2212.10440},
primaryClass = {cs.CL},
adsurl = {https://ui.adsabs.harvard.edu/abs/2022arXiv221210440J},
adsnote = {Provided by the SAO/NASA Astrophysics Data System}
}
@inproceedings{abadji-etal-2022-towards,
title = "Towards a Cleaner Document-Oriented Multilingual Crawled Corpus",
author = "Abadji, Julien and
Ortiz Suarez, Pedro and
Romary, Laurent and
Sagot, Beno{\^\i}t",
booktitle = "Proceedings of the Thirteenth Language Resources and Evaluation Conference",
month = jun,
year = "2022",
address = "Marseille, France",
publisher = "European Language Resources Association",
url = "https://aclanthology.org/2022.lrec-1.463",
pages = "4344--4355",
abstract = "The need for large corpora raw corpora has dramatically increased in recent years with the introduction of transfer learning and semi-supervised learning methods to Natural Language Processing. And while there have been some recent attempts to manually curate the amount of data necessary to train large language models, the main way to obtain this data is still through automatic web crawling. In this paper we take the existing multilingual web corpus OSCAR and its pipeline Ungoliant that extracts and classifies data from Common Crawl at the line level, and propose a set of improvements and automatic annotations in order to produce a new document-oriented version of OSCAR that could prove more suitable to pre-train large generative language models as well as hopefully other applications in Natural Language Processing and Digital Humanities.",
}
@inproceedings{AbadjiOrtizSuarezRomaryetal.2021,
author = {Julien Abadji and Pedro Javier Ortiz Su{\'a}rez and Laurent Romary and Beno{\^i}t Sagot},
title = {Ungoliant: An optimized pipeline for the generation of a very large-scale multilingual web corpus},
series = {Proceedings of the Workshop on Challenges in the Management of Large Corpora (CMLC-9) 2021. Limerick, 12 July 2021 (Online-Event)},
editor = {Harald L{\"u}ngen and Marc Kupietz and Piotr Bański and Adrien Barbaresi and Simon Clematide and Ines Pisetta},
publisher = {Leibniz-Institut f{\"u}r Deutsche Sprache},
address = {Mannheim},
doi = {10.14618/ids-pub-10468},
url = {https://nbn-resolving.org/urn:nbn:de:bsz:mh39-104688},
pages = {1 -- 9},
year = {2021},
abstract = {Since the introduction of large language models in Natural Language Processing, large raw corpora have played a crucial role in Computational Linguistics. However, most of these large raw corpora are either available only for English or not available to the general public due to copyright issues. Nevertheless, there are some examples of freely available multilingual corpora for training Deep Learning NLP models, such as the OSCAR and Paracrawl corpora. However, they have quality issues, especially for low-resource languages. Moreover, recreating or updating these corpora is very complex. In this work, we try to reproduce and improve the goclassy pipeline used to create the OSCAR corpus. We propose a new pipeline that is faster, modular, parameterizable, and well documented. We use it to create a corpus similar to OSCAR but larger and based on recent data. Also, unlike OSCAR, the metadata information is at the document level. We release our pipeline under an open source license and publish the corpus under a research-only license.},
language = {en}
}
@article{kreutzer-etal-2022-quality,
title = "Quality at a Glance: An Audit of Web-Crawled Multilingual Datasets",
author = {Kreutzer, Julia and
Caswell, Isaac and
Wang, Lisa and
Wahab, Ahsan and
van Esch, Daan and
Ulzii-Orshikh, Nasanbayar and
Tapo, Allahsera and
Subramani, Nishant and
Sokolov, Artem and
Sikasote, Claytone and
Setyawan, Monang and
Sarin, Supheakmungkol and
Samb, Sokhar and
Sagot, Beno{\^\i}t and
Rivera, Clara and
Rios, Annette and
Papadimitriou, Isabel and
Osei, Salomey and
Suarez, Pedro Ortiz and
Orife, Iroro and
Ogueji, Kelechi and
Rubungo, Andre Niyongabo and
Nguyen, Toan Q. and
M{\"u}ller, Mathias and
M{\"u}ller, Andr{\'e} and
Muhammad, Shamsuddeen Hassan and
Muhammad, Nanda and
Mnyakeni, Ayanda and
Mirzakhalov, Jamshidbek and
Matangira, Tapiwanashe and
Leong, Colin and
Lawson, Nze and
Kudugunta, Sneha and
Jernite, Yacine and
Jenny, Mathias and
Firat, Orhan and
Dossou, Bonaventure F. P. and
Dlamini, Sakhile and
de Silva, Nisansa and
{\c{C}}abuk Ball{\i}, Sakine and
Biderman, Stella and
Battisti, Alessia and
Baruwa, Ahmed and
Bapna, Ankur and
Baljekar, Pallavi and
Azime, Israel Abebe and
Awokoya, Ayodele and
Ataman, Duygu and
Ahia, Orevaoghene and
Ahia, Oghenefego and
Agrawal, Sweta and
Adeyemi, Mofetoluwa},
journal = "Transactions of the Association for Computational Linguistics",
volume = "10",
year = "2022",
address = "Cambridge, MA",
publisher = "MIT Press",
url = "https://aclanthology.org/2022.tacl-1.4",
doi = "10.1162/tacl_a_00447",
pages = "50--72",
abstract = "With the success of large-scale pre-training and multilingual modeling in Natural Language Processing (NLP), recent years have seen a proliferation of large, Web-mined text datasets covering hundreds of languages. We manually audit the quality of 205 language-specific corpora released with five major public datasets (CCAligned, ParaCrawl, WikiMatrix, OSCAR, mC4). Lower-resource corpora have systematic issues: At least 15 corpora have no usable text, and a significant fraction contains less than 50{\%} sentences of acceptable quality. In addition, many are mislabeled or use nonstandard/ambiguous language codes. We demonstrate that these issues are easy to detect even for non-proficient speakers, and supplement the human audit with automatic analyses. Finally, we recommend techniques to evaluate and improve multilingual corpora and discuss potential risks that come with low-quality data releases.",
}
@inproceedings{ortiz-suarez-etal-2020-monolingual,
title = "A Monolingual Approach to Contextualized Word Embeddings for Mid-Resource Languages",
author = "Ortiz Su{\'a}rez, Pedro Javier and
Romary, Laurent and
Sagot, Beno{\^\i}t",
booktitle = "Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics",
month = jul,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2020.acl-main.156",
pages = "1703--1714",
abstract = "We use the multilingual OSCAR corpus, extracted from Common Crawl via language classification, filtering and cleaning, to train monolingual contextualized word embeddings (ELMo) for five mid-resource languages. We then compare the performance of OSCAR-based and Wikipedia-based ELMo embeddings for these languages on the part-of-speech tagging and parsing tasks. We show that, despite the noise in the Common-Crawl-based OSCAR data, embeddings trained on OSCAR perform much better than monolingual embeddings trained on Wikipedia. They actually equal or improve the current state of the art in tagging and parsing for all five languages. In particular, they also improve over multilingual Wikipedia-based contextual embeddings (multilingual BERT), which almost always constitutes the previous state of the art, thereby showing that the benefit of a larger, more diverse corpus surpasses the cross-lingual benefit of multilingual embedding architectures.",
}
@inproceedings{OrtizSuarezSagotRomary2019,
author = {Pedro Javier {Ortiz Su{\'a}rez} and Beno{\^\i}t Sagot and Laurent Romary},
title = {Asynchronous pipelines for processing huge corpora on medium to low resource infrastructures},
series = {Proceedings of the Workshop on Challenges in the Management of Large Corpora (CMLC-7) 2019. Cardiff, 22nd July 2019},
editor = {Piotr Bański and Adrien Barbaresi and Hanno Biber and Evelyn Breiteneder and Simon Clematide and Marc Kupietz and Harald L{\"u}ngen and Caroline Iliadi},
publisher = {Leibniz-Institut f{\"u}r Deutsche Sprache},
address = {Mannheim},
doi = {10.14618/ids-pub-9021},
url = {http://nbn-resolving.de/urn:nbn:de:bsz:mh39-90215},
pages = {9 -- 16},
year = {2019},
abstract = {Common Crawl is a considerably large, heterogeneous multilingual corpus comprised of crawled documents from the internet, surpassing 20TB of data and distributed as a set of more than 50 thousand plain text files where each contains many documents written in a wide variety of languages. Even though each document has a metadata block associated to it, this data lacks any information about the language in which each document is written, making it extremely difficult to use Common Crawl for monolingual applications. We propose a general, highly parallel, multithreaded pipeline to clean and classify Common Crawl by language; we specifically design it so that it runs efficiently on medium to low resource infrastructures where I/O speeds are the main constraint. We develop the pipeline so that it can be easily reapplied to any kind of heterogeneous corpus and so that it can be parameterised to a wide range of infrastructures. We also distribute a 6.3TB version of Common Crawl, filtered, classified by language, shuffled at line level in order to avoid copyright issues, and ready to be used for NLP applications.},
language = {en}
}
``` |
lucadiliello/textbookqa | ---
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence: string
- name: key
dtype: string
- name: labels
list:
- name: end
sequence: int64
- name: start
sequence: int64
splits:
- name: test
num_bytes: 5371294
num_examples: 1503
download_size: 802199
dataset_size: 5371294
---
# Dataset Card for "textbookqa"
Split taken from the MRQA 2019 Shared Task, formatted and filtered for Question Answering. For the original dataset, have a look [here](https://huggingface.co/datasets/mrqa). |
argilla/dpo-mix-7k | ---
language:
- en
license: mit
size_categories:
- 1K<n<10K
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: dataset
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: chosen_rating
dtype: float64
- name: rejected_rating
dtype: float64
splits:
- name: train
num_bytes: 41362946
num_examples: 6750
- name: test
num_bytes: 4586808
num_examples: 750
download_size: 24232011
dataset_size: 45949754
tags:
- distilabel
- synthetic
- dpo
---
# Argilla DPO Mix 7K Dataset
> A small cocktail combining DPO datasets built by Argilla with [distilabel](https://github.com/argilla-io/distilabel). The goal of this dataset is to provide a small, high-quality DPO dataset by keeping only highly rated chosen responses.
<div>
<img src="https://cdn-uploads.huggingface.co/production/uploads/60420dccc15e823a685f2b03/Csd2-zPji7iwIxyz6UFe1.webp">
</div>
<p align="center">
<a href="https://github.com/argilla-io/distilabel">
<img src="https://raw.githubusercontent.com/argilla-io/distilabel/main/docs/assets/distilabel-badge-light.png" alt="Built with Distilabel" width="200" height="32"/>
</a>
</p>
## Datasets mixed
As already mentioned, this dataset mixes the following datasets:
* [`argilla/distilabel-capybara-dpo-7k-binarized`](https://huggingface.co/datasets/argilla/distilabel-capybara-dpo-7k-binarized): random sample of highly scored chosen responses (>=4).
* [`argilla/distilabel-intel-orca-dpo-pairs`](https://huggingface.co/datasets/argilla/distilabel-intel-orca-dpo-pairs): random sample of highly scored chosen responses (>=8).
* [`argilla/ultrafeedback-binarized-preferences-cleaned`](https://huggingface.co/datasets/argilla/ultrafeedback-binarized-preferences-cleaned): random sample of highly scored chosen responses (>=4).
The samples have been randomly selected from the original datasets with a proportion of 0.33 each, as can be seen via the `dataset` column of the dataset.
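Since every row records its source in the `dataset` column, the mixing proportions can be checked with a few lines. The sketch below runs on synthetic rows; for the real data, load `argilla/dpo-mix-7k` with `datasets` and pass its rows instead:

```python
from collections import Counter

def mix_proportions(rows):
    """Return the share of each source dataset among rows that carry a
    'dataset' column, as in this mix."""
    counts = Counter(row["dataset"] for row in rows)
    total = sum(counts.values())
    return {name: n / total for name, n in counts.items()}

# Synthetic rows standing in for the real mix.
rows = (
    [{"dataset": "capybara"}] * 2
    + [{"dataset": "orca"}] * 1
    + [{"dataset": "ultrafeedback"}] * 1
)
print(mix_proportions(rows))  # → {'capybara': 0.5, 'orca': 0.25, 'ultrafeedback': 0.25}
```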
## Next steps
* Adding more samples
* Use data selection techniques to improve the diversity, usefulness, and complexity of the dataset. |
benayas/atis_chatgpt_5pct_v1 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 442917
num_examples: 4455
download_size: 146969
dataset_size: 442917
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yzhuang/autotree_snnxor_n15_l2_10 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: input_y_clean
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 484120000
num_examples: 10000
- name: validation
num_bytes: 484120000
num_examples: 10000
- name: test
num_bytes: 484120000
num_examples: 10000
download_size: 597791512
dataset_size: 1452360000
---
# Dataset Card for "autotree_snnxor_n15_l2_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jonathan-roberts1/SATIN | ---
license: other
configs:
- config_name: SAT-4
- config_name: SAT-6
- config_name: NASC-TG2
- config_name: WHU-RS19
- config_name: RSSCN7
- config_name: RS_C11
- config_name: SIRI-WHU
- config_name: EuroSAT
- config_name: NWPU-RESISC45
- config_name: PatternNet
- config_name: RSD46-WHU
- config_name: GID
- config_name: CLRS
- config_name: Optimal-31
- config_name: Airbus-Wind-Turbines-Patches
- config_name: USTC_SmokeRS
- config_name: Canadian_Cropland
- config_name: Ships-In-Satellite-Imagery
- config_name: Satellite-Images-of-Hurricane-Damage
- config_name: Brazilian_Coffee_Scenes
- config_name: Brazilian_Cerrado-Savanna_Scenes
- config_name: Million-AID
- config_name: UC_Merced_LandUse_MultiLabel
- config_name: MLRSNet
- config_name: MultiScene
- config_name: RSI-CB256
- config_name: AID_MultiLabel
task_categories:
- image-classification
- zero-shot-image-classification
pretty_name: SATellite ImageNet
size_categories:
- 100K<n<1M
language:
- en
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:** [https://satinbenchmark.github.io](https://satinbenchmark.github.io)
- **Repository:**
- **Paper:** [SATIN: A Multi-Task Metadataset for Classifying Satellite Imagery using Vision-Language Models](https://arxiv.org/pdf/2304.11619.pdf)
- **Leaderboard:** [SATIN Leaderboard](https://satinbenchmark.github.io/leaderboard.md)
### Dataset Summary
SATIN (SATellite ImageNet) is a metadataset containing 27 constituent satellite and aerial image datasets spanning 6 distinct tasks: Land Cover, Land Use,
Hierarchical Land Use, Complex Scenes, Rare Scenes, and False Colour Scenes. The imagery is globally distributed, comprising resolutions spanning 5 orders
of magnitude, multiple field-of-view sizes, and over 250 distinct class labels. Presented at the ICCV '23 TNGCV Workshop.
## Dataset Structure
The SATIN benchmark is comprised of the following datasets:
#### Task 1: Land Cover
- SAT-4
- SAT-6
- NASC-TG2
#### Task 2: Land Use
- WHU-RS19
- RSSCN7
- RS_C11
- SIRI-WHU
- EuroSAT
- NWPU-RESISC45
- PatternNet
- RSD46-WHU
- GID
- CLRS
- Optimal-31
#### Task 3: Hierarchical Land Use
- Million-AID
- RSI-CB256
#### Task 4: Complex Scenes
- UC_Merced_LandUse_MultiLabel
- MLRSNet
- MultiScene
- AID_MultiLabel
#### Task 5: Rare Scenes
- Airbus-Wind-Turbines-Patches
- USTC_SmokeRS
- Canadian_Cropland
- Ships-In-Satellite-Imagery
- Satellite-Images-of-Hurricane-Damage
#### Task 6: False Colour Scenes
- Brazilian_Coffee_Scenes
- Brazilian_Cerrado-Savanna_Scenes
For ease of use and to avoid having to download the entire benchmark for each use, in this dataset repository, each of the 27 datasets is included as a separate
'config'.
### Example Usage
```python
from datasets import load_dataset
hf_dataset = load_dataset('jonathan-roberts1/SATIN', DATASET_NAME, split='train') # for DATASET_NAME use one of the configs listed above (e.g., EuroSAT)
features = hf_dataset.features
class_labels = features['label'].names
#class_labels = features['label'].feature.names # for the Complex Scenes datasets
#class_labels_1 = features['label_1'].names # for the Hierarchical Land Use datasets, the label field is replaced with label_1, label_2, ...
random_index = 5
example = hf_dataset[random_index]
image, label = example['image'], example['label']
```
### Data Splits
For each config, there is just the single, default 'train' split.
### Source Data
More information regarding the source data can be found in our paper. Additionally, each of the constituent datasets has been uploaded to HuggingFace datasets.
They can be accessed at: huggingface.co/datasets/jonathan-roberts1/DATASET_NAME.
### Dataset Curators
This dataset was curated by Jonathan Roberts, Kai Han, and Samuel Albanie
### Licensing Information
As SATIN is comprised of existing datasets with differing licenses, there is not a single license for SATIN. All of the datasets in SATIN can be used
for research purposes; usage information of specific constituent datasets can be found in the Appendix of our paper.
### Citation Information
```
@article{roberts2023satin,
title = {SATIN: A Multi-Task Metadataset for Classifying Satellite Imagery using Vision-Language Models},
author = {Jonathan Roberts, Kai Han, and Samuel Albanie},
year = {2023},
eprint = {2304.11619},
archivePrefix= {arXiv},
primaryClass = {cs.CV}
}
``` |
jlbaker361/cyberpunk-1000-cropped | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: frame
dtype: int64
- name: title
dtype: string
splits:
- name: train
num_bytes: 53110410.0
num_examples: 243
download_size: 53102137
dataset_size: 53110410.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-eval-samsum-samsum-08013b-2758881773 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- samsum
eval_info:
task: summarization
model: pszemraj/tglobal-large-booksum-WIP4-r1
metrics: []
dataset_name: samsum
dataset_config: samsum
dataset_split: test
col_mapping:
text: dialogue
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/tglobal-large-booksum-WIP4-r1
* Dataset: samsum
* Config: samsum
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model. |
nisancoskun/finnish_sentiment_data | ---
license: mit
task_categories:
- text-classification
language:
- fi
source_datasets:
- sepidmnorozy/Finnish_sentiment
- https://github.com/cynarr/sentiment-analysis
size_categories:
- 10K<n<100K
--- |
pykeio/vtuber-chats-4.5m | ---
license: apache-2.0
language:
- ja
- en
- ko
- zh
- id
- tl
tags:
- livestream
- stream
pretty_name: VTuber Chats 4.5M
size_categories:
- 1M<n<10M
---
# VTuber Chats 4.5M
A dataset of 4,562,579 chat messages collected from various Hololive and Nijisanji YouTube live streams.
Note that the provided language detection values can be very inaccurate on shorter messages and should not be depended on. |
zolak/twitter_dataset_80_1713173045 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 285410
num_examples: 672
download_size: 145346
dataset_size: 285410
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
distilled-one-sec-cv12-each-chunk-uniq/chunk_189 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1308172460.0
num_examples: 254905
download_size: 1338015185
dataset_size: 1308172460.0
---
# Dataset Card for "chunk_189"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Censius-AI/ECommerce-Women-Clothing-Reviews | ---
license: apache-2.0
---
|
joey234/mmlu-miscellaneous-verbal-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 237559
num_examples: 783
download_size: 153226
dataset_size: 237559
---
# Dataset Card for "mmlu-miscellaneous-verbal-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Sleoruiz/speeches-congre-clean-names | ---
dataset_info:
features:
- name: text
dtype: string
- name: gaceta_numero
dtype: string
- name: fecha_gaceta
dtype: string
- name: comision
dtype: string
- name: name
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 181327260
num_examples: 94501
download_size: 92131968
dataset_size: 181327260
---
# Dataset Card for "speeches-congre-clean-names"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sheik21/kevin | ---
license: openrail
---
|
mmaak/medical_meadow_medqa_data | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 10499824
num_examples: 10178
download_size: 5460295
dataset_size: 10499824
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AescF/common_language_preprocessed | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: client_id
dtype: string
- name: path
dtype: string
- name: sentence
dtype: string
- name: age
dtype: string
- name: gender
dtype: string
- name: label
dtype:
class_label:
names:
'0': Arabic
'1': Basque
'2': Breton
'3': Catalan
'4': Chinese_China
'5': Chinese_Hongkong
'6': Chinese_Taiwan
'7': Chuvash
'8': Czech
'9': Dhivehi
'10': Dutch
'11': English
'12': Esperanto
'13': Estonian
'14': French
'15': Frisian
'16': Georgian
'17': German
'18': Greek
'19': Hakha_Chin
'20': Indonesian
'21': Interlingua
'22': Italian
'23': Japanese
'24': Kabyle
'25': Kinyarwanda
'26': Kyrgyz
'27': Latvian
'28': Maltese
'29': Mangolian
'30': Persian
'31': Polish
'32': Portuguese
'33': Romanian
'34': Romansh_Sursilvan
'35': Russian
'36': Sakha
'37': Slovenian
'38': Spanish
'39': Swedish
'40': Tamil
'41': Tatar
'42': Turkish
'43': Ukranian
'44': Welsh
- name: input_values
sequence: float32
- name: attention_mask
sequence: int32
splits:
- name: train
num_bytes: 13848986619
num_examples: 22194
- name: validation
num_bytes: 3461442109
num_examples: 5888
- name: test
num_bytes: 3473659131
num_examples: 5963
download_size: 0
dataset_size: 20784087859
---
# Dataset Card for "common_language_preprocessed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bezzam/DigiCam-Mirflickr-SingleMask-1K | ---
license: mit
dataset_info:
features:
- name: lensless
dtype: image
- name: lensed
dtype: image
splits:
- name: train
num_bytes: 400976354.0
num_examples: 850
- name: test
num_bytes: 70756509.0
num_examples: 150
download_size: 471722260
dataset_size: 471732863.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
AdapterOcean/data-standardized_cluster_22 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 130911576
num_examples: 12737
download_size: 37520503
dataset_size: 130911576
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "data-standardized_cluster_22"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Sadiksha/sentiment_analysis_data | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 1741533
num_examples: 16000
- name: test
num_bytes: 217173
num_examples: 2000
- name: valid
num_bytes: 214695
num_examples: 2000
download_size: 1286836
dataset_size: 2173401
---
# Dataset Card for "sentiment_analysis_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/nyx_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of nyx (Fire Emblem)
This is the dataset of nyx (Fire Emblem), containing 58 images and their tags.
The core tags of this character are `black_hair, long_hair, facial_mark, breasts, red_eyes, small_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 58 | 61.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nyx_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 58 | 37.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nyx_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 122 | 71.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nyx_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 58 | 54.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nyx_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 122 | 97.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nyx_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/nyx_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 58 |  |  |  |  |  | 1girl, solo, forehead_mark, looking_at_viewer, cape, bodystocking, simple_background, tiara, mouth_veil, covered_navel, book, cleavage, thighhighs, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | forehead_mark | looking_at_viewer | cape | bodystocking | simple_background | tiara | mouth_veil | covered_navel | book | cleavage | thighhighs | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:----------------|:--------------------|:-------|:---------------|:--------------------|:--------|:-------------|:----------------|:-------|:-----------|:-------------|:-------------------|
| 0 | 58 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
tjaffri/wikisql-generate | ---
license: bsd-3-clause
dataset_info:
features:
- name: input
dtype: string
- name: table_info
dtype: string
- name: sql_cmd
dtype: string
splits:
- name: test
num_bytes: 9526974
num_examples: 15462
- name: validation
num_bytes: 5034756
num_examples: 8243
- name: train
num_bytes: 33996901
num_examples: 54963
download_size: 11329076
dataset_size: 48558631
---
# WikiSQL Dataset (Reformatted for Generative Models)
This is the exact same dataset as WikiSQL: https://huggingface.co/datasets/wikisql, but with the data reformatted to allow direct use with text generation LLMs. The original license and credits for the original dataset remain in place.
Specifically, the changes from standard WikiSQL are:
1. The table details in WikiSQL were included as dictionaries, but tools like [LangChain](https://python.langchain.com/en/latest/modules/chains/examples/sqlite.html) and [LlamaIndex](https://medium.com/llamaindex-blog/combining-text-to-sql-with-semantic-search-for-retrieval-augmented-generation-c60af30ec3b) build their prompts using a SQL DESCRIBE of the tables, which is included in this dataset as `table_info`.
1. In addition, some of the SQL commands in WikiSQL that were not syntactically valid (e.g., due to unquoted identifiers) were removed. Specifically, we created in-memory (SQLite) tables from the SQL DESCRIBE of the tables, then ran each WikiSQL human-readable SQL query against these in-memory tables. Any SQL queries that threw exceptions for any reason were discarded; the rest, which ran without exceptions, are included in this dataset as `sql_cmd`.
1. The SQL queries under `sql_cmd` were also formatted to capitalize keywords and otherwise pretty-print the SQL using [SQLParse](https://sqlparse.readthedocs.io/en/latest/), to make the SQL more standard and easier to learn for smaller models.
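The validation step described above can be sketched roughly like this (the DDL and queries below are illustrative, not drawn from the dataset):

```python
import sqlite3

def runs_clean(table_ddl: str, query: str) -> bool:
    """Return True when `query` executes without error against an
    in-memory SQLite table built from `table_ddl`."""
    conn = sqlite3.connect(":memory:")
    try:
        conn.execute(table_ddl)
        conn.execute(query)
        return True
    except sqlite3.Error:
        return False
    finally:
        conn.close()

# Hypothetical table definition and queries for illustration
ddl = 'CREATE TABLE "table" ("Player" TEXT, "Points" INTEGER)'
print(runs_clean(ddl, 'SELECT "Player" FROM "table" WHERE "Points" > 10'))  # True
print(runs_clean(ddl, 'SELECT nope FROM "table"'))                          # False
```

Queries for which `runs_clean` returns `False` would be the ones discarded from the dataset.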
# Suggested Uses
This dataset may be used for the following purposes:
1. Combine SQL queries with text based retrieval, using techniques like the [LlamaIndex SQLAutoVectorQueryEngine](https://gpt-index.readthedocs.io/en/latest/examples/query_engine/SQLAutoVectorQueryEngine.html).
1. Fine-tuning LLMs to generate SQL commands from natural language inputs, given a SQL DESCRIBE of tables and various rows. This is exactly the use case for the [LangChain](https://python.langchain.com/en/latest/modules/chains/examples/sqlite.html) SQLChain, so once fine-tuned, these LLMs may be used directly with these chains for theoretically better results (not tried at the time of writing)
1. Few shot prompt seeding of LLMs used to generate SQL commands from natural language inputs.
|
faruk/bengali-names-vs-gender | ---
license: afl-3.0
---
# Bengali Female VS Male Names Dataset
An NLP dataset containing 2,030 samples of Bengali names and their corresponding gender, covering both female and male names. This is a very small and simple toy dataset that NLP beginners can use to practice sequence classification and related problems such as gender recognition from names.
# Background
In the Bengali language, a person's name depends largely on their gender. Female names normally end with certain suffixes such as "A", "I", "EE" ["আ", "ই", "ঈ"], while male names differ significantly from female names in their phoneme patterns and ending suffixes. In my observation, there is a significant possibility that these differences in patterns can be used for gender classification based on names.
Find the full documentation here:
[Documentation and dataset specifications](https://github.com/faruk-ahmad/bengali-female-vs-male-names)
## Dataset Format
The dataset is in CSV format. There are two columns, namely:
1. Name
2. Gender
Each row has two attributes: the first is the name and the second is the gender. The name attribute is `utf-8` encoded, and the gender attribute is encoded as 0 or 1:
| Gender | Label |
|---|---|
| male | 0 |
| female | 1 |
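A minimal sketch of reading this layout and mapping the labels back to strings (the header names and sample rows below are assumptions for illustration, not real entries):

```python
import csv
import io

# Illustrative two-column sample in the dataset's CSV layout;
# the header names "Name"/"Gender" are assumed here.
sample = "Name,Gender\nফারুক,0\nআয়েশা,1\n"

label_names = {0: "male", 1: "female"}
rows = [
    (row["Name"], label_names[int(row["Gender"])])
    for row in csv.DictReader(io.StringIO(sample))
]
print(rows)  # [('ফারুক', 'male'), ('আয়েশা', 'female')]
```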
## Dataset Statistics
The number of samples per class is as below:
| Gender | Count |
|---|---|
| male | 1029 |
| female | 1001 |
## Possible Use Cases
1. Sequence Classification using RNN, LSTM etc
2. Sequence modeling using other types of machine learning algorithms
3. Gender recognition based on names
## Disclaimer
The names were collected from the internet using different sources such as Wikipedia, baby-name suggestion websites, and blogs. If someone's name appears in the dataset, that is entirely unintentional. |
Nekochu/discord-unstable-diffusion-SD-prompts | ---
license: apache-2.0
---
## Dataset Description
A list of SD prompts in Alpaca format from Discord servers, mostly "Unstable Diffusion", and also including "Umi AI", "Aitrepreneur", and "Softology"
Detailed alpaca format: [system context optional]\n\n### Instruction:\nCreate stable diffusion metadata based on the given english description. [brief description of prompt]\n### Input:\n[one channel Discord](https://pastebin.com/07PuBaQp),\n### Response:\n
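A rough sketch of assembling that template (the helper name and the exact blank-line separators are our assumptions; only the section markers come from the card):

```python
def build_alpaca_prompt(description: str, channel: str, system: str = "") -> str:
    """Assemble the Alpaca-style prompt described above.

    `description` fills the [brief description of prompt] slot and
    `channel` fills the Discord-channel Input slot."""
    parts = []
    if system:  # [system context optional]
        parts.append(system)
    parts.append(
        "### Instruction:\nCreate stable diffusion metadata based on the "
        f"given english description. {description}"
    )
    parts.append(f"### Input:\n{channel}")
    parts.append("### Response:\n")
    return "\n\n".join(parts)

print(build_alpaca_prompt("a neon-lit cyberpunk alley", "#sd-prompts"))
```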
#### Data Collection and Processing
11/2023: created the dataset [DiscordPromptSD.json](https://huggingface.co/datasets/Nekochu/discord-unstable-diffusion-SD-prompts/blob/main/DiscordPromptSD.json) with these tools:
- DiscordChatExporter to bulk-download messages, keeping only image prompts with metadata as "output" and the channel name as "input".
- Captioning used for "instruction": ViT-L-14/openai (pharmapsychotic/clip-interrogator-ext), otherwise a spaCy summarizer.
- Kainet Editor and my (test) scripts: [scrape](https://pastebin.com/w8qPjjiL), [format](https://pastebin.com/gGmDrmjX)[_](https://pastebin.com/VtG9LSuG), [dedup](https://pastebin.com/zZWaH4V3) for misc replacements.
03/2024: dataset [ExtendedPrompts.json](https://huggingface.co/datasets/Nekochu/discord-unstable-diffusion-SD-prompts/blob/main/ExtendedPrompts.json). Collection not by me:
- To build a combined dataset from [neuralworm](https://huggingface.co/datasets/neuralworm/stable-diffusion-discord-prompts), [sengunsipahi](https://huggingface.co/datasets/sengunsipahi/civitai_top10k), [Ar4ikov](https://huggingface.co/datasets/Ar4ikov/civitai_sd_337_prompts), [thefcraft](https://huggingface.co/datasets/thefcraft/civitai-stable-diffusion-337k), [xzuyn](https://huggingface.co/datasets/xzuyn/Stable-Diffusion-Prompts-Deduped-2.008M), [Gustavosta](https://huggingface.co/datasets/Gustavosta/Stable-Diffusion-Prompts), and [MadVoyager](https://huggingface.co/datasets/MadVoyager/stable_diffusion_instructional_dataset), I used my [fine-tuned Bart model](https://huggingface.co/Nekochu/distilbart-cnn-12-6-SD-prompt) to create short summary prompts that complete the missing "Input" instructions for an Alpaca template; this dataset is not part of [Luminia v3](https://huggingface.co/Nekochu/Luminia-13B-v3). |
Tong0217/common_language | ---
license: openrail
---
|
ybelkada/model_cards_correct_tag | ---
dataset_info:
features:
- name: commit_dates
dtype: string
- name: total_transformers_model
dtype: int64
- name: missing_library_name
dtype: int64
splits:
- name: train
num_bytes: 1620
num_examples: 54
download_size: 3008
dataset_size: 1620
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
irds/mmarco_v2_es_train | ---
pretty_name: '`mmarco/v2/es/train`'
viewer: false
source_datasets: ['irds/mmarco_v2_es']
task_categories:
- text-retrieval
---
# Dataset Card for `mmarco/v2/es/train`
The `mmarco/v2/es/train` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/mmarco#mmarco/v2/es/train).
# Data
This dataset provides:
- `queries` (i.e., topics); count=808,731
- `qrels`: (relevance assessments); count=532,761
- `docpairs`; count=39,780,811
- For `docs`, use [`irds/mmarco_v2_es`](https://huggingface.co/datasets/irds/mmarco_v2_es)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/mmarco_v2_es_train', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/mmarco_v2_es_train', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
docpairs = load_dataset('irds/mmarco_v2_es_train', 'docpairs')
for record in docpairs:
record # {'query_id': ..., 'doc_id_a': ..., 'doc_id_b': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Bonifacio2021MMarco,
title={{mMARCO}: A Multilingual Version of {MS MARCO} Passage Ranking Dataset},
author={Luiz Henrique Bonifacio and Israel Campiotti and Roberto Lotufo and Rodrigo Nogueira},
year={2021},
journal={arXiv:2108.13897}
}
```
|
autoevaluate/autoeval-staging-eval-project-57377e87-7975067 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- food101
eval_info:
task: image_multi_class_classification
model: aspis/swin-finetuned-food101
metrics: []
dataset_name: food101
dataset_config: default
dataset_split: validation
col_mapping:
image: image
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Image Classification
* Model: aspis/swin-finetuned-food101
* Dataset: food101
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
autoevaluate/autoeval-eval-phpthinh__exampletx-toxic-7252ee-1708159804 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- phpthinh/exampletx
eval_info:
task: text_zero_shot_classification
model: bigscience/bloom-1b7
metrics: []
dataset_name: phpthinh/exampletx
dataset_config: toxic
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: bigscience/bloom-1b7
* Dataset: phpthinh/exampletx
* Config: toxic
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@phpthinh](https://huggingface.co/phpthinh) for evaluating this model. |
freshpearYoon/train_free_39 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 9604561592
num_examples: 10000
download_size: 1248571418
dataset_size: 9604561592
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_vanillaOVO__supermario_v4 | ---
pretty_name: Evaluation run of vanillaOVO/supermario_v4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [vanillaOVO/supermario_v4](https://huggingface.co/vanillaOVO/supermario_v4) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vanillaOVO__supermario_v4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-01T22:55:06.227389](https://huggingface.co/datasets/open-llm-leaderboard/details_vanillaOVO__supermario_v4/blob/main/results_2024-02-01T22-55-06.227389.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6599989129866183,\n\
\ \"acc_stderr\": 0.03192841805798971,\n \"acc_norm\": 0.6593861923643444,\n\
\ \"acc_norm_stderr\": 0.03259944262143704,\n \"mc1\": 0.576499388004896,\n\
\ \"mc1_stderr\": 0.017297421448534744,\n \"mc2\": 0.7206547057471042,\n\
\ \"mc2_stderr\": 0.014737356055250207\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.712457337883959,\n \"acc_stderr\": 0.013226719056266129,\n\
\ \"acc_norm\": 0.734641638225256,\n \"acc_norm_stderr\": 0.012902554762313957\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7123083051185023,\n\
\ \"acc_stderr\": 0.004517614647703243,\n \"acc_norm\": 0.8876717785301733,\n\
\ \"acc_norm_stderr\": 0.003151244960241657\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544064,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544064\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5158730158730159,\n\
\ \"acc_stderr\": 0.044698818540726076,\n \"acc_norm\": 0.5158730158730159,\n\
\ \"acc_norm_stderr\": 0.044698818540726076\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7935483870967742,\n \"acc_stderr\": 0.023025899617188716,\n \"\
acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.023025899617188716\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"\
acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.02889774874113115,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.02889774874113115\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297794,\n \
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297794\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"\
acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290913,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290913\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371805,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371805\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n\
\ \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43798882681564244,\n\
\ \"acc_stderr\": 0.016593394227564843,\n \"acc_norm\": 0.43798882681564244,\n\
\ \"acc_norm_stderr\": 0.016593394227564843\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"\
acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47522816166883963,\n\
\ \"acc_stderr\": 0.012754553719781753,\n \"acc_norm\": 0.47522816166883963,\n\
\ \"acc_norm_stderr\": 0.012754553719781753\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507208,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507208\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.576499388004896,\n\
\ \"mc1_stderr\": 0.017297421448534744,\n \"mc2\": 0.7206547057471042,\n\
\ \"mc2_stderr\": 0.014737356055250207\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8524072612470402,\n \"acc_stderr\": 0.009968715765479646\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7012888551933283,\n \
\ \"acc_stderr\": 0.012607137125693633\n }\n}\n```"
repo_url: https://huggingface.co/vanillaOVO/supermario_v4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|arc:challenge|25_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|gsm8k|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hellaswag|10_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T22-55-06.227389.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T22-55-06.227389.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- '**/details_harness|winogrande|5_2024-02-01T22-55-06.227389.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-01T22-55-06.227389.parquet'
- config_name: results
data_files:
- split: 2024_02_01T22_55_06.227389
path:
- results_2024-02-01T22-55-06.227389.parquet
- split: latest
path:
- results_2024-02-01T22-55-06.227389.parquet
---
# Dataset Card for Evaluation run of vanillaOVO/supermario_v4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [vanillaOVO/supermario_v4](https://huggingface.co/vanillaOVO/supermario_v4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_vanillaOVO__supermario_v4",
"harness_winogrande_5",
split="train")
```
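Each per-task configuration name listed above follows the pattern `harness_<task>_<shots>`, with `:` and `-` in the harness task id replaced by `_`. The helper below is a hypothetical convenience (not part of the `datasets` API) sketching how to derive a config name from a task id as it appears in the results JSON:

```python
def config_name(task_id: str) -> str:
    """Map a harness task id such as 'harness|hendrycksTest-virology|5'
    to the name of the corresponding dataset configuration."""
    prefix, task, shots = task_id.split("|")
    # Config names replace ':' and '-' with '_' (see the config list above).
    task = task.replace(":", "_").replace("-", "_")
    return f"{prefix}_{task}_{shots}"

print(config_name("harness|hendrycksTest-virology|5"))
# harness_hendrycksTest_virology_5
```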
## Latest results
These are the [latest results from run 2024-02-01T22:55:06.227389](https://huggingface.co/datasets/open-llm-leaderboard/details_vanillaOVO__supermario_v4/blob/main/results_2024-02-01T22-55-06.227389.json) (note that there might be results for other tasks in the repo if successive evaluations didn't cover the same tasks; each task can be found in its own "latest" split, and the aggregates in the "results" config):
```python
{
"all": {
"acc": 0.6599989129866183,
"acc_stderr": 0.03192841805798971,
"acc_norm": 0.6593861923643444,
"acc_norm_stderr": 0.03259944262143704,
"mc1": 0.576499388004896,
"mc1_stderr": 0.017297421448534744,
"mc2": 0.7206547057471042,
"mc2_stderr": 0.014737356055250207
},
"harness|arc:challenge|25": {
"acc": 0.712457337883959,
"acc_stderr": 0.013226719056266129,
"acc_norm": 0.734641638225256,
"acc_norm_stderr": 0.012902554762313957
},
"harness|hellaswag|10": {
"acc": 0.7123083051185023,
"acc_stderr": 0.004517614647703243,
"acc_norm": 0.8876717785301733,
"acc_norm_stderr": 0.003151244960241657
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.674074074074074,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.674074074074074,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544064,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544064
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5158730158730159,
"acc_stderr": 0.044698818540726076,
"acc_norm": 0.5158730158730159,
"acc_norm_stderr": 0.044698818540726076
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.023025899617188716,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.023025899617188716
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.02889774874113115,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.02889774874113115
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297794,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297794
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660834,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660834
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290913,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290913
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371805,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371805
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.02335736578587403,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.02335736578587403
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43798882681564244,
"acc_stderr": 0.016593394227564843,
"acc_norm": 0.43798882681564244,
"acc_norm_stderr": 0.016593394227564843
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47522816166883963,
"acc_stderr": 0.012754553719781753,
"acc_norm": 0.47522816166883963,
"acc_norm_stderr": 0.012754553719781753
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462927,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462927
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507208,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507208
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644286,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233268,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233268
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.576499388004896,
"mc1_stderr": 0.017297421448534744,
"mc2": 0.7206547057471042,
"mc2_stderr": 0.014737356055250207
},
"harness|winogrande|5": {
"acc": 0.8524072612470402,
"acc_stderr": 0.009968715765479646
},
"harness|gsm8k|5": {
"acc": 0.7012888551933283,
"acc_stderr": 0.012607137125693633
}
}
```
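Since the results above are a plain Python dict, ad-hoc aggregation is straightforward. A minimal sketch, using only a small excerpt of the per-task scores copied from the results (not the full file):

```python
# Excerpt of the per-task scores shown above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.35},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.674074074074074},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7105263157894737},
    "harness|winogrande|5": {"acc": 0.8524072612470402},
}

# Average accuracy over the MMLU (hendrycksTest) subtasks only.
mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest")]
mean_acc = sum(mmlu) / len(mmlu)
print(f"{mean_acc:.4f}")
# 0.5782
```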
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
open-llm-leaderboard/details_grimjim__kukulemon-7B
---
pretty_name: Evaluation run of grimjim/kukulemon-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [grimjim/kukulemon-7B](https://huggingface.co/grimjim/kukulemon-7B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_grimjim__kukulemon-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-12T06:10:18.406525](https://huggingface.co/datasets/open-llm-leaderboard/details_grimjim__kukulemon-7B/blob/main/results_2024-03-12T06-10-18.406525.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6530505561930542,\n\
\ \"acc_stderr\": 0.032076888103327796,\n \"acc_norm\": 0.654916222472434,\n\
\ \"acc_norm_stderr\": 0.032718314960330744,\n \"mc1\": 0.4394124847001224,\n\
\ \"mc1_stderr\": 0.017374520482513707,\n \"mc2\": 0.6199218454707466,\n\
\ \"mc2_stderr\": 0.015289736195923467\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6493174061433447,\n \"acc_stderr\": 0.013944635930726096,\n\
\ \"acc_norm\": 0.6774744027303754,\n \"acc_norm_stderr\": 0.013659980894277366\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6824337781318462,\n\
\ \"acc_stderr\": 0.004645783048004575,\n \"acc_norm\": 0.8609838677554272,\n\
\ \"acc_norm_stderr\": 0.0034525630964691366\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411022,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411022\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224469,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224469\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3941798941798942,\n \"acc_stderr\": 0.02516798233389414,\n \"\
acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.02516798233389414\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642507,\n \"\
acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642507\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5172413793103449,\n \"acc_stderr\": 0.03515895551165698,\n \"\
acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.03515895551165698\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.02805779167298902,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.02805779167298902\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \
\ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37037037037037035,\n \"acc_stderr\": 0.029443169323031537,\n \
\ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.029443169323031537\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461766,\n \"\
acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461766\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n\
\ \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.02574490253229092,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.02574490253229092\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903335,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903335\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546837,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546837\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39888268156424583,\n\
\ \"acc_stderr\": 0.01637696614261008,\n \"acc_norm\": 0.39888268156424583,\n\
\ \"acc_norm_stderr\": 0.01637696614261008\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875192,\n\
\ \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875192\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n\
\ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n\
\ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.0279715413701706,\n\
\ \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.0279715413701706\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.01899970738316267,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.01899970738316267\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.027979823538744546,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.027979823538744546\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061452,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061452\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4394124847001224,\n\
\ \"mc1_stderr\": 0.017374520482513707,\n \"mc2\": 0.6199218454707466,\n\
\ \"mc2_stderr\": 0.015289736195923467\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7924230465666929,\n \"acc_stderr\": 0.011398593419386783\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6103108415466262,\n \
\ \"acc_stderr\": 0.01343312323611072\n }\n}\n```"
repo_url: https://huggingface.co/grimjim/kukulemon-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|arc:challenge|25_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|gsm8k|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hellaswag|10_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-12T06-10-18.406525.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-12T06-10-18.406525.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- '**/details_harness|winogrande|5_2024-03-12T06-10-18.406525.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-12T06-10-18.406525.parquet'
- config_name: results
data_files:
- split: 2024_03_12T06_10_18.406525
path:
- results_2024-03-12T06-10-18.406525.parquet
- split: latest
path:
- results_2024-03-12T06-10-18.406525.parquet
---
# Dataset Card for Evaluation run of grimjim/kukulemon-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [grimjim/kukulemon-7B](https://huggingface.co/grimjim/kukulemon-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_grimjim__kukulemon-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-12T06:10:18.406525](https://huggingface.co/datasets/open-llm-leaderboard/details_grimjim__kukulemon-7B/blob/main/results_2024-03-12T06-10-18.406525.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6530505561930542,
"acc_stderr": 0.032076888103327796,
"acc_norm": 0.654916222472434,
"acc_norm_stderr": 0.032718314960330744,
"mc1": 0.4394124847001224,
"mc1_stderr": 0.017374520482513707,
"mc2": 0.6199218454707466,
"mc2_stderr": 0.015289736195923467
},
"harness|arc:challenge|25": {
"acc": 0.6493174061433447,
"acc_stderr": 0.013944635930726096,
"acc_norm": 0.6774744027303754,
"acc_norm_stderr": 0.013659980894277366
},
"harness|hellaswag|10": {
"acc": 0.6824337781318462,
"acc_stderr": 0.004645783048004575,
"acc_norm": 0.8609838677554272,
"acc_norm_stderr": 0.0034525630964691366
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411022,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411022
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.049135952012744975,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.049135952012744975
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224469,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224469
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.02516798233389414,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.02516798233389414
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642507,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642507
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.03515895551165698,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.03515895551165698
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.02805779167298902,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.02805779167298902
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.029443169323031537,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.029443169323031537
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461766,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461766
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.02574490253229092,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.02574490253229092
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5357142857142857,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.5357142857142857,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903335,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903335
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546837,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546837
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39888268156424583,
"acc_stderr": 0.01637696614261008,
"acc_norm": 0.39888268156424583,
"acc_norm_stderr": 0.01637696614261008
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875192,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875192
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153266,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.0279715413701706,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.0279715413701706
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.01899970738316267,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.01899970738316267
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644286,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.027979823538744546,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.027979823538744546
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061452,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061452
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4394124847001224,
"mc1_stderr": 0.017374520482513707,
"mc2": 0.6199218454707466,
"mc2_stderr": 0.015289736195923467
},
"harness|winogrande|5": {
"acc": 0.7924230465666929,
"acc_stderr": 0.011398593419386783
},
"harness|gsm8k|5": {
"acc": 0.6103108415466262,
"acc_stderr": 0.01343312323611072
}
}
```
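As an illustration, once a results dict of the shape shown above has been loaded (e.g. from the linked JSON file), it can be summarized programmatically. This is a minimal sketch using a hand-copied subset of the inline values; the `best_task_by_acc` helper is hypothetical, not part of any leaderboard tooling:

```python
# A small subset of the aggregated results shown above, reproduced for illustration.
results = {
    "all": {"acc": 0.6530505561930542, "acc_norm": 0.654916222472434},
    "harness|arc:challenge|25": {"acc": 0.6493174061433447, "acc_norm": 0.6774744027303754},
    "harness|hellaswag|10": {"acc": 0.6824337781318462, "acc_norm": 0.8609838677554272},
    "harness|gsm8k|5": {"acc": 0.6103108415466262},
}

def best_task_by_acc(results: dict) -> str:
    """Return the task key (excluding the 'all' aggregate) with the highest raw accuracy."""
    per_task = {k: v["acc"] for k, v in results.items() if k != "all" and "acc" in v}
    return max(per_task, key=per_task.get)

print(best_task_by_acc(results))  # prints "harness|hellaswag|10"
```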
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |