| datasetId | card |
|---|---|
Abdirahman555/SomGPT | ---
license: mit
task_categories:
- text-generation
language:
- so
tags:
- somali
- NLP
pretty_name: SomGPT
size_categories:
- 10K<n<100K
--- |
MattewJWLiu/segformer_anime_poc | ---
dataset_info:
features:
- name: image
dtype: image
- name: annotation
dtype: image
splits:
- name: train
num_bytes: 3845790650.0
num_examples: 50000
- name: validation
num_bytes: 77211372.0
num_examples: 1000
- name: test
num_bytes: 76563652.0
num_examples: 1000
download_size: 3998320571
dataset_size: 3999565674.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
argilla/end2end_textclassification | ---
size_categories: 1K<n<10K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for end2end_textclassification
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla with `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("argilla/end2end_textclassification")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` with `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("argilla/end2end_textclassification")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/conceptual_guides/data_model.html#feedback-dataset) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, **metadata**, **vectors**, and **guidelines**.
The **fields** are the dataset records themselves; at the moment, only text fields are supported. These are the ones that annotators will use to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| text | Text | text | True | False |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label_selection, multi_label_selection, or ranking.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| label | Label | label_selection | True | Classify the text by selecting the correct label from the given list of labels. | ['World', 'Sports', 'Business', 'Sci/Tech'] |
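For downstream processing, the label set of the `label` question above can be mapped to integer ids, as standard text-classification training code expects. A minimal sketch:

```python
# Label set taken from the `label` question above, mapped to integer ids.
labels = ["World", "Sports", "Business", "Sci/Tech"]
label2id = {label: i for i, label in enumerate(labels)}
id2label = {i: label for label, i in label2id.items()}

print(label2id["Sci/Tech"])  # 3
```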
The **suggestions** are human- or machine-generated recommendations for each question, provided to assist the annotator during the annotation process. They are always linked to existing questions and are named by appending "-suggestion" and "-suggestion-metadata" to the question name; these columns contain the suggested value(s) and the suggestion's metadata, respectively. The possible values are therefore the same as in the table above.
The **metadata** is a dictionary that can be used to provide additional information about the dataset record. This can be useful to give annotators extra context, or to record details about the record itself, such as a link to the original source, the author, the date, or the provenance. Metadata is always optional, and can be linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
| Metadata Name | Title | Type | Values | Visible for Annotators |
| ------------- | ----- | ---- | ------ | ---------------------- |
The **guidelines** are also optional; they are a plain string that can be used to provide instructions to the annotators. Find them in the [annotation guidelines](#annotation-guidelines) section.
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"external_id": "record-0",
"fields": {
"text": "Wall St. Bears Claw Back Into the Black (Reuters) Reuters - Short-sellers, Wall Street\u0027s dwindling\\band of ultra-cynics, are seeing green again."
},
"metadata": {},
"responses": [],
"suggestions": [],
"vectors": {}
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"external_id": "record-0",
"label": [],
"label-suggestion": null,
"label-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"metadata": "{}",
"text": "Wall St. Bears Claw Back Into the Black (Reuters) Reuters - Short-sellers, Wall Street\u0027s dwindling\\band of ultra-cynics, are seeing green again."
}
```
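Note that in the `datasets` format the `metadata` field is serialized as a JSON string, and a record without responses has an empty `label` list. A minimal sketch of unpacking such a record (record contents abbreviated here):

```python
import json

# A record in the HuggingFace `datasets` format, as in the example above
# (text abbreviated for brevity).
record = {
    "external_id": "record-0",
    "label": [],
    "label-suggestion": None,
    "label-suggestion-metadata": {"agent": None, "score": None, "type": None},
    "metadata": "{}",
    "text": "Wall St. Bears Claw Back Into the Black (Reuters) ...",
}

metadata = json.loads(record["metadata"])  # metadata is a serialized JSON string
is_annotated = bool(record["label"])       # empty list -> no responses yet

print(metadata, is_annotated)  # {} False
```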
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves; at the moment, only text fields are supported. These are the ones that annotators use to provide responses to the questions.
* **text** is of type `text`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **label** is of type `label_selection` with the following allowed values ['World', 'Sports', 'Business', 'Sci/Tech'], and description "Classify the text by selecting the correct label from the given list of labels.".
* **Suggestions:** As of Argilla 1.13.0, the suggestions have been included to provide the annotators with suggestions to ease or assist during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
* (optional) **label-suggestion** is of type `label_selection` with the following allowed values ['World', 'Sports', 'Business', 'Sci/Tech'].
Additionally, we also have two more fields that are optional and are the following:
* **metadata:** This is an optional field that can be used to provide additional information about the dataset record. This can be useful to give annotators extra context, or to record details about the record itself, such as a link to the original source, the author, the date, or the provenance. Metadata is always optional, and can be linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
Classify the articles into one of the four categories.
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
davanstrien/test_card | ---
dataset_info:
features:
- name: id
dtype: string
- name: lastModified
dtype: string
- name: tags
sequence: string
- name: author
dtype: string
- name: description
dtype: string
- name: citation
dtype: string
- name: cardData
dtype: 'null'
- name: likes
dtype: int64
- name: downloads
dtype: int64
- name: card
dtype: string
splits:
- name: train
num_bytes: 203107730
num_examples: 69309
download_size: 52854496
dataset_size: 203107730
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "test_card"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
andreotte/multi-label-classification-test-small | ---
dataset_info:
features:
- name: label
dtype:
class_label:
names:
0: Door
1: Eaves
2: Gutter
3: Vegetation
4: Vent
5: Window
- name: pixel_values
dtype: image
splits:
- name: test
num_bytes: 1579714.0
num_examples: 25
- name: train
num_bytes: 3593924.0
num_examples: 59
download_size: 5175857
dataset_size: 5173638.0
---
# Dataset Card for "multi-label-classification-test-small"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-high_school_us_history-dev | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: dev
num_bytes: 17615
num_examples: 5
download_size: 0
dataset_size: 17615
---
# Dataset Card for "mmlu-high_school_us_history-dev"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yangp/Tanis | ---
license: apache-2.0
---
|
ZhenbinWang/brain | ---
license: unknown
---
|
Marchanjo/spider-FIT-es | ---
license: cc-by-sa-4.0
---
Distributed under the Creative Commons CC BY-SA 4.0 license, respecting the ShareAlike terms of the [Spider Dataset](https://yale-lily.github.io/spider).
Code explanations and links to the model checkpoints and datasets are on GitHub: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
Here is the [Hugging Face collection](https://huggingface.co/collections/Marchanjo/mrat-sql-65a671743bb0e70b416561f6), where you can download the model checkpoints and datasets; for the full context, it is better to start from the GitHub repository [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
# mRAT-SQL-FIT
## A Multilingual Translator to SQL with Database Schema Pruning to Improve Self-Attention
Marcelo Archanjo Jose, Fabio Gagliardi Cozman
Long sequences of text are challenging for transformers, due to the quadratic memory growth of the self-attention mechanism. As this issue directly affects translation from natural language to SQL queries (techniques usually take as input the question concatenated with the database schema), we present techniques that allow long text sequences to be handled by transformers limited to 512 input tokens. We propose a training process with database schema pruning (removal of table and column names that are useless for the query of interest). In addition, we used a multilingual approach with the mT5-large model fine-tuned on a data-augmented Spider dataset in four languages simultaneously: English, Portuguese, Spanish, and French. Our proposed technique, applied to the Spider dataset, increased the exact-set-match accuracy from 0.718 to 0.736 on the validation set (Dev). Source code, evaluations, and checkpoints are available at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
Paper published in Springer Nature's [International Journal of Information Technology](https://doi.org/10.1007/s41870-023-01342-3); [SharedIt link](https://rdcu.be/dff19); [pre-print on arXiv](https://arxiv.org/abs/2306.14256).
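To illustrate the schema-pruning idea described above, here is a naive, hypothetical sketch (not the authors' implementation, which is in the linked repository): keep only the tables and columns whose names are mentioned in the question, so that question plus serialized schema fit within the transformer's token budget.

```python
def prune_schema(question: str, schema: dict) -> dict:
    """Naive schema pruning: drop tables/columns whose names do not
    appear in the question, shortening the serialized input so that
    question + schema fit within the transformer's 512-token budget."""
    q = question.lower()
    pruned = {}
    for table, columns in schema.items():
        kept = [c for c in columns if c.lower().replace("_", " ") in q]
        # Keep a table if any of its columns matched, or the table itself is named;
        # in the latter case, fall back to keeping all of its columns.
        if kept or table.lower() in q:
            pruned[table] = kept or columns
    return pruned

schema = {"singer": ["name", "age"], "concert": ["venue", "year"]}
question = "What is the name of the oldest singer?"
print(prune_schema(question, schema))  # {'singer': ['name']}
```

A real implementation would match against question tokens more robustly (lemmatization, value matching against database content), but the sketch conveys the shape of the idea.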
# mRAT-SQL+GAP
## mRAT-SQL+GAP: A Portuguese Text-to-SQL Transformer
Marcelo Archanjo José, Fabio Gagliardi Cozman
The translation of natural language questions to SQL queries has attracted growing attention, in particular in connection with transformers and similar language models. A large number of techniques are geared towards the English language; in this work, we therefore investigated translation to SQL when input questions are given in Portuguese. To do so, we adapted state-of-the-art tools and resources: we changed the RAT-SQL+GAP system to rely on a multilingual BART model (we report tests with other language models), and we produced a translated version of the Spider dataset. Our experiments expose interesting phenomena that arise when non-English languages are targeted; in particular, it is better to train with the original and translated training datasets together, even if a single target language is desired. This multilingual BART model, fine-tuned with a double-size training dataset (English and Portuguese), achieved 83% of the baseline when making inferences on the Portuguese test set. This investigation can help other researchers produce Machine Learning results in languages other than English. Our multilingual-ready version of RAT-SQL+GAP and the data are available, open-sourced as mRAT-SQL+GAP at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
BRACIS 2021: paper published in [Springer Lecture Notes in Computer Science](https://link.springer.com/chapter/10.1007%2F978-3-030-91699-2_35); [pre-print on arXiv](https://arxiv.org/abs/2110.03546).
Based on: RAT-SQL+GAP: [Github](https://github.com/awslabs/gap-text2sql). Paper: [AAAI 2021 paper](https://arxiv.org/abs/2012.10309) |
Codec-SUPERB/vox_lingua_extract_unit | ---
configs:
- config_name: default
data_files:
- split: academicodec_hifi_16k_320d
path: data/academicodec_hifi_16k_320d-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
- split: audiodec_24k_320d
path: data/audiodec_24k_320d-*
- split: dac_16k
path: data/dac_16k-*
- split: dac_24k
path: data/dac_24k-*
- split: dac_44k
path: data/dac_44k-*
- split: encodec_24k
path: data/encodec_24k-*
- split: funcodec_en_libritts_16k_gr1nq32ds320
path: data/funcodec_en_libritts_16k_gr1nq32ds320-*
- split: funcodec_en_libritts_16k_gr8nq32ds320
path: data/funcodec_en_libritts_16k_gr8nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds320
path: data/funcodec_en_libritts_16k_nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds640
path: data/funcodec_en_libritts_16k_nq32ds640-*
- split: funcodec_zh_en_16k_nq32ds320
path: data/funcodec_zh_en_16k_nq32ds320-*
- split: funcodec_zh_en_16k_nq32ds640
path: data/funcodec_zh_en_16k_nq32ds640-*
- split: speech_tokenizer_16k
path: data/speech_tokenizer_16k-*
dataset_info:
features:
- name: id
dtype: string
- name: unit
sequence:
sequence: int64
splits:
- name: academicodec_hifi_16k_320d
num_bytes: 29050426
num_examples: 972
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 29050426
num_examples: 972
- name: academicodec_hifi_24k_320d
num_bytes: 43544890
num_examples: 972
- name: audiodec_24k_320d
num_bytes: 92891386
num_examples: 972
- name: dac_16k
num_bytes: 177758650
num_examples: 972
- name: dac_24k
num_bytes: 499327354
num_examples: 972
- name: dac_44k
num_bytes: 146207530
num_examples: 972
- name: encodec_24k
num_bytes: 21810970
num_examples: 972
- name: funcodec_en_libritts_16k_gr1nq32ds320
num_bytes: 232330618
num_examples: 972
- name: funcodec_en_libritts_16k_gr8nq32ds320
num_bytes: 232330618
num_examples: 972
- name: funcodec_en_libritts_16k_nq32ds320
num_bytes: 232330618
num_examples: 972
- name: funcodec_en_libritts_16k_nq32ds640
num_bytes: 116374906
num_examples: 972
- name: funcodec_zh_en_16k_nq32ds320
num_bytes: 232330618
num_examples: 972
- name: funcodec_zh_en_16k_nq32ds640
num_bytes: 232330618
num_examples: 972
- name: speech_tokenizer_16k
num_bytes: 58117114
num_examples: 972
download_size: 322284705
dataset_size: 2375786742
---
# Dataset Card for "vox_lingua_extract_unit"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mireiaplalis/processed_cadec_no_prefix | ---
dataset_info:
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': 0-ADR
'2': 0-Drug
'3': 0-Disease
'4': 0-Symptom
'5': 0-Finding
- name: info
sequence: string
splits:
- name: train
num_bytes: 2118471.2
num_examples: 1000
- name: test
num_bytes: 264808.9
num_examples: 125
- name: validation
num_bytes: 264808.9
num_examples: 125
download_size: 442353
dataset_size: 2648089.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
open-llm-leaderboard/details_InnerI__InnerILLM-OpenPipe-Nous-Yarn-Mistral-optimized-1228-7B-slerp | ---
pretty_name: Evaluation run of InnerI/InnerILLM-OpenPipe-Nous-Yarn-Mistral-optimized-1228-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [InnerI/InnerILLM-OpenPipe-Nous-Yarn-Mistral-optimized-1228-7B-slerp](https://huggingface.co/InnerI/InnerILLM-OpenPipe-Nous-Yarn-Mistral-optimized-1228-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_InnerI__InnerILLM-OpenPipe-Nous-Yarn-Mistral-optimized-1228-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-09T18:38:53.137384](https://huggingface.co/datasets/open-llm-leaderboard/details_InnerI__InnerILLM-OpenPipe-Nous-Yarn-Mistral-optimized-1228-7B-slerp/blob/main/results_2024-03-09T18-38-53.137384.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6511782363779303,\n\
\ \"acc_stderr\": 0.03198321204026635,\n \"acc_norm\": 0.6530684920502713,\n\
\ \"acc_norm_stderr\": 0.03262648468991683,\n \"mc1\": 0.3671970624235006,\n\
\ \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5351031185202262,\n\
\ \"mc2_stderr\": 0.014961733868018287\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6237201365187713,\n \"acc_stderr\": 0.014157022555407163,\n\
\ \"acc_norm\": 0.6578498293515358,\n \"acc_norm_stderr\": 0.013864152159177278\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6571400119498108,\n\
\ \"acc_stderr\": 0.004736950810617791,\n \"acc_norm\": 0.8521210914160526,\n\
\ \"acc_norm_stderr\": 0.0035425443194051424\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544057,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544057\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n\
\ \"acc_stderr\": 0.035149425512674394,\n \"acc_norm\": 0.6936416184971098,\n\
\ \"acc_norm_stderr\": 0.035149425512674394\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6085106382978723,\n \"acc_stderr\": 0.03190701242326812,\n\
\ \"acc_norm\": 0.6085106382978723,\n \"acc_norm_stderr\": 0.03190701242326812\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n\
\ \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.7903225806451613,\n\
\ \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121437,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121437\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6846153846153846,\n \"acc_stderr\": 0.02355964698318994,\n \
\ \"acc_norm\": 0.6846153846153846,\n \"acc_norm_stderr\": 0.02355964698318994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640763,\n \"\
acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640763\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179323,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179323\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n\
\ \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n\
\ \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546835,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546835\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3843575418994413,\n\
\ \"acc_stderr\": 0.0162690886639594,\n \"acc_norm\": 0.3843575418994413,\n\
\ \"acc_norm_stderr\": 0.0162690886639594\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042103,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042103\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46088657105606257,\n\
\ \"acc_stderr\": 0.012731102790504514,\n \"acc_norm\": 0.46088657105606257,\n\
\ \"acc_norm_stderr\": 0.012731102790504514\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \
\ \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399673,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399673\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n\
\ \"acc_stderr\": 0.022509345325101706,\n \"acc_norm\": 0.8855721393034826,\n\
\ \"acc_norm_stderr\": 0.022509345325101706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3671970624235006,\n\
\ \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5351031185202262,\n\
\ \"mc2_stderr\": 0.014961733868018287\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8058405682715075,\n \"acc_stderr\": 0.011116983392392657\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6141015921152388,\n \
\ \"acc_stderr\": 0.013409077471319168\n }\n}\n```"
repo_url: https://huggingface.co/InnerI/InnerILLM-OpenPipe-Nous-Yarn-Mistral-optimized-1228-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|arc:challenge|25_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|gsm8k|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hellaswag|10_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T18-38-53.137384.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T18-38-53.137384.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- '**/details_harness|winogrande|5_2024-03-09T18-38-53.137384.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-09T18-38-53.137384.parquet'
- config_name: results
data_files:
- split: 2024_03_09T18_38_53.137384
path:
- results_2024-03-09T18-38-53.137384.parquet
- split: latest
path:
- results_2024-03-09T18-38-53.137384.parquet
---
# Dataset Card for Evaluation run of InnerI/InnerILLM-OpenPipe-Nous-Yarn-Mistral-optimized-1228-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [InnerI/InnerILLM-OpenPipe-Nous-Yarn-Mistral-optimized-1228-7B-slerp](https://huggingface.co/InnerI/InnerILLM-OpenPipe-Nous-Yarn-Mistral-optimized-1228-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_InnerI__InnerILLM-OpenPipe-Nous-Yarn-Mistral-optimized-1228-7B-slerp",
"harness_winogrande_5",
split="train")
```
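The per-task configuration names listed in the YAML metadata above appear to follow a mechanical convention: the harness task identifier (`harness|task|n_fewshot`) with `|`, `:`, and `-` replaced by underscores. A small helper sketching that mapping (an assumption inferred from the names above, not an official API):

```python
import re

def task_to_config(task_id: str) -> str:
    """Map a harness task identifier to its dataset config name.

    e.g. "harness|hendrycksTest-virology|5" -> "harness_hendrycksTest_virology_5"
    """
    return re.sub(r"[|:\-]", "_", task_id)
```

The resulting name can then be passed as the second argument to `load_dataset`, as in the example above.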
## Latest results
These are the [latest results from run 2024-03-09T18:38:53.137384](https://huggingface.co/datasets/open-llm-leaderboard/details_InnerI__InnerILLM-OpenPipe-Nous-Yarn-Mistral-optimized-1228-7B-slerp/blob/main/results_2024-03-09T18-38-53.137384.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.6511782363779303,
"acc_stderr": 0.03198321204026635,
"acc_norm": 0.6530684920502713,
"acc_norm_stderr": 0.03262648468991683,
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5351031185202262,
"mc2_stderr": 0.014961733868018287
},
"harness|arc:challenge|25": {
"acc": 0.6237201365187713,
"acc_stderr": 0.014157022555407163,
"acc_norm": 0.6578498293515358,
"acc_norm_stderr": 0.013864152159177278
},
"harness|hellaswag|10": {
"acc": 0.6571400119498108,
"acc_stderr": 0.004736950810617791,
"acc_norm": 0.8521210914160526,
"acc_norm_stderr": 0.0035425443194051424
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544057,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.035149425512674394,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.035149425512674394
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6085106382978723,
"acc_stderr": 0.03190701242326812,
"acc_norm": 0.6085106382978723,
"acc_norm_stderr": 0.03190701242326812
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083522,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083522
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121437,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6846153846153846,
"acc_stderr": 0.02355964698318994,
"acc_norm": 0.6846153846153846,
"acc_norm_stderr": 0.02355964698318994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297793,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640763,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640763
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179323,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179323
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546835,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546835
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3843575418994413,
"acc_stderr": 0.0162690886639594,
"acc_norm": 0.3843575418994413,
"acc_norm_stderr": 0.0162690886639594
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.025494259350694912,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.025494259350694912
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042103,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042103
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46088657105606257,
"acc_stderr": 0.012731102790504514,
"acc_norm": 0.46088657105606257,
"acc_norm_stderr": 0.012731102790504514
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399673,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399673
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101706,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5351031185202262,
"mc2_stderr": 0.014961733868018287
},
"harness|winogrande|5": {
"acc": 0.8058405682715075,
"acc_stderr": 0.011116983392392657
},
"harness|gsm8k|5": {
"acc": 0.6141015921152388,
"acc_stderr": 0.013409077471319168
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_202 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1181863384.0
num_examples: 232102
download_size: 1207428704
dataset_size: 1181863384.0
---
# Dataset Card for "chunk_202"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Sachin7/chatbot | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 868534.4067526252
num_examples: 5266
- name: test
num_bytes: 372252.5932473747
num_examples: 2257
download_size: 366550
dataset_size: 1240787.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Nexdata/English_Intention_Annotation_Data_in_Interactive_Scenes | ---
task_categories:
- text-classification
language:
- en
---
# Dataset Card for Nexdata/English_Intention_Annotation_Data_in_Interactive_Scenes
## Description
84,516 sentences of English intention annotation data in interactive scenes, annotated with intent classes and including slot and slot-value information. The intent field covers music, weather, date, schedule, home equipment, etc. The data is applied to intent recognition research and related fields.
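The card describes intent labels accompanied by slot and slot-value information. A hypothetical record illustrating that shape (the field names here are illustrative only; the actual Excel schema is not shown in this card):

```python
# A made-up example of an intent-annotated sentence with slots.
record = {
    "text": "What's the weather in London tomorrow?",
    "intent": "weather",
    "slots": {"location": "London", "date": "tomorrow"},
}

def slot_values(rec: dict) -> list[str]:
    # Collect the annotated slot values from a record.
    return list(rec["slots"].values())
```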
For more details, please refer to the link: https://www.nexdata.ai/datasets/1154?source=Huggingface
# Specifications
## Content
Intent-type single-sentence annotation data
## Label Content
Sentences with the corresponding intentions are manually written and annotated with intent labels.
## Storage Format
Excel
## Language
English
## Data Size
84,516 Sentences
## Application Scenario
Intention understanding in speech interaction
# Licensing Information
Commercial License |
In2Training/VaLProbing-32K | ---
license: mit
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
- name: set_id
dtype: int64
- name: position_id
dtype: int64
- name: label
dtype: string
- name: description
dtype: string
splits:
- name: document_bi_32k
num_bytes: 414417632
num_examples: 3200
- name: code_backward_32k
num_bytes: 247574073
num_examples: 3200
- name: database_forward_32k
num_bytes: 268852430
num_examples: 3000
download_size: 515780530
dataset_size: 930844135
configs:
- config_name: default
data_files:
- split: document_bi_32k
path: data/document_bi_32k-*
- split: code_backward_32k
path: data/code_backward_32k-*
- split: database_forward_32k
path: data/database_forward_32k-*
---
For the usage of VaLProbing-32K, see the guidance in [https://github.com/microsoft/FILM/](https://github.com/microsoft/FILM/). |
jquigl/imdb-genres | ---
license: cc-by-nc-sa-4.0
language:
- en
---
# Dataset Card for IMDb Movie Dataset: All Movies by Genre
## Dataset Description
- **Homepage:** https://www.kaggle.com/datasets/rajugc/imdb-movies-dataset-based-on-genre?select=history.csv
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset is an adapted version of **"IMDb Movie Dataset: All Movies by Genre"** found at: https://www.kaggle.com/datasets/rajugc/imdb-movies-dataset-based-on-genre?select=history.csv.
Compared with the source data, the following changes were made:
- The movie title and year columns were combined into a single column.
- The genre was extracted from the separate per-genre CSV files.
- The pre-existing genre column was renamed to expanded-genres.
- Any movies missing a description (i.e. the description was written as "Add a plot / Plot unknown" or similar) were dropped from the original data.
- The rating column was left unchanged, and the rest of the remaining columns were dropped.

The columns in the data are: "movie title - year", "genre", "expanded-genres", "rating", and "description".
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
Language data is English.
## Dataset Structure
### Data Instances
**training** = 238256 entries
- Total amount of disk used: 54.3 MB
**test** = 29756 entries
- Total amount of disk used: 6.77 MB
**validation** = 29809 entries
- Total amount of disk used: 6.78 MB
An example of one entry looks as follows:
```
{
"movie title - year" : "Die Hard - 1988",
"genre" : "Action",
"expanded-genres" : "Action, Thriller",
"rating" : 8.2,
"description" : "A New York City police officer tries to save his estranged wife and several others taken hostage by terrorists during a Christmas party at the Nakatomi Plaza in Los Angeles."
}
```
### Data Fields
The data fields are the same among all splits.
Fields are as follows:
- "movie title - year": a string feature.
- "genre": a string classification label, with the possible values: 'Adventure', 'Action', 'Thriller', 'Romance', 'Crime', 'Fantasy', 'Mystery', 'Horror', 'War', 'Family', 'Animation', 'Scifi', 'Sports', 'History', 'Biography', and 'Film-noir'.
- "expanded-genres" : a string feature.
- "rating" : a floating point value, ranging from 0.0 to 10.0.
- "description" : a string feature.
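Since the title and year are stored combined (e.g. "Die Hard - 1988"), splitting them back apart is a common first step; a minimal sketch, assuming the " - " before the year is the last such separator in the string:

```python
def split_title_year(entry: str) -> tuple[str, int]:
    """Split a "movie title - year" string into its parts.

    Splitting on the last " - " keeps hyphenated titles intact.
    """
    title, year = entry.rsplit(" - ", 1)
    return title, int(year)
```

For example, `split_title_year("Die Hard - 1988")` yields `("Die Hard", 1988)`.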
### Source Data
**"IMDb Movie Dataset: All Movies by Genre"** found at: https://www.kaggle.com/datasets/rajugc/imdb-movies-dataset-based-on-genre?select=history.csv
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
result-kand2-sdxl-wuerst-karlo/53decd51 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 190
num_examples: 10
download_size: 1351
dataset_size: 190
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "53decd51"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/OK-VQA_test_google_flan_t5_xxl_mode_A_CM_D_PNP_GENERIC_Q_rices_ns_5046 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
sequence: string
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large__text
num_bytes: 58427944
num_examples: 5046
download_size: 10440899
dataset_size: 58427944
---
# Dataset Card for "OK-VQA_test_google_flan_t5_xxl_mode_A_CM_D_PNP_GENERIC_Q_rices_ns_5046"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
grammarly/detexd-benchmark | ---
license: apache-2.0
task_categories:
- text-classification
language:
- en
size_categories:
- 1K<n<10K
pretty_name: 'DeTexD: A Benchmark Dataset for Delicate Text Detection'
dataset_info:
features:
- name: text
dtype: string
- name: annotator_1
dtype: int32
- name: annotator_2
dtype: int32
- name: annotator_3
dtype: int32
- name: label
dtype:
class_label:
names:
'0': negative
'1': positive
splits:
- name: test
num_examples: 1023
---
# Dataset Card for DeTexD: A Benchmark Dataset for Delicate Text Detection
## Dataset Description
- **Repository:** [DeTexD repository](https://github.com/grammarly/detexd)
- **Paper:** [DeTexD: A Benchmark Dataset for Delicate Text Detection](TODO)
### Dataset Summary
We define *delicate text* as any text that is emotionally charged or potentially triggering such that engaging with it has the potential to result in harm. This broad term covers a range of sensitive texts that vary across four major dimensions: 1) riskiness, 2) explicitness, 3) topic, and 4) target.
This dataset contains texts with fine-grained individual annotator labels from 0 to 5 (where 0 indicates no risk and 5 indicates high risk) and averaged binary labels. See paper for more details.
## Dataset Structure
### Data Instances
```
{'text': '"He asked me and the club if we could give him a couple of days off just to clear up his mind and he will be back in the group, I suppose, next Monday, back for training and then be a regular part of the whole squad again," Rangnick said.',
'annotator_1': 0,
'annotator_2': 0,
'annotator_3': 0,
'label': 0}
```
### Data Fields
- `text`: Text to be classified
- `annotator_1`: Annotator 1 score (0-5)
- `annotator_2`: Annotator 2 score (0-5)
- `annotator_3`: Annotator 3 score (0-5)
- `label`: Averaged binary label (positive when the average annotator score is >= 3), either "negative" (0) or "positive" (1)
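A sketch of how the binary label appears to be derived from the three annotator scores, assuming the card's "averaged binary score (>=3)" description means thresholding the mean at 3:

```python
def binary_label(a1: int, a2: int, a3: int) -> int:
    """Return 1 ("positive") when the mean annotator score reaches 3."""
    return int((a1 + a2 + a3) / 3 >= 3)
```

On the data instance above, `binary_label(0, 0, 0)` gives 0, matching its `label` field.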
### Data Splits
| | test |
|--------------------|-----:|
| Number of examples | 1023 |
### Citation Information
```
@inproceedings{chernodub-etal-2023-detexd,
title = "{D}e{T}ex{D}: A Benchmark Dataset for Delicate Text Detection",
author = "Yavnyi, Serhii and Sliusarenko, Oleksii and Razzaghi, Jade and Mo, Yichen and Hovakimyan, Knar and Chernodub, Artem",
booktitle = "The 7th Workshop on Online Abuse and Harms (WOAH)",
month = jul,
year = "2023",
address = "Toronto, Canada",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.woah-1.2",
pages = "14--28",
abstract = "Over the past few years, much research has been conducted to identify and regulate toxic language. However, few studies have addressed a broader range of sensitive texts that are not necessarily overtly toxic. In this paper, we introduce and define a new category of sensitive text called {``}delicate text.{''} We provide the taxonomy of delicate text and present a detailed annotation scheme. We annotate DeTexD, the first benchmark dataset for delicate text detection. The significance of the difference in the definitions is highlighted by the relative performance deltas between models trained each definitions and corpora and evaluated on the other. We make publicly available the DeTexD Benchmark dataset, annotation guidelines, and baseline model for delicate text detection.",
}
``` |
arminmrm93/free_recipe_with_embed | ---
dataset_info:
features:
- name: title
dtype: string
- name: text
dtype: string
- name: embeddings
sequence: float64
splits:
- name: train
num_bytes: 14679976
num_examples: 2082
download_size: 0
dataset_size: 14679976
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "free_recipe_with_embed"
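Given the `embeddings` column (a float sequence per recipe), a typical use is nearest-neighbour retrieval; a minimal cosine-similarity sketch, assuming all embeddings share the same dimensionality:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def most_similar(query: list[float], embeddings: list[list[float]]) -> int:
    """Index of the stored embedding most similar to the query."""
    return max(range(len(embeddings)),
               key=lambda i: cosine_similarity(query, embeddings[i]))
```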
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
intanm/webis-clickbait-spoiling-seq-tag | ---
license: apache-2.0
---
|
climatebert/climate_specificity | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- en
license: cc-by-nc-sa-4.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids: []
pretty_name: ClimateSpecificity
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': non-specific
'1': specific
splits:
- name: train
num_bytes: 492077
num_examples: 1000
- name: test
num_bytes: 174265
num_examples: 320
download_size: 373454
dataset_size: 666342
---
# Dataset Card for climate_specificity
## Dataset Description
- **Homepage:** [climatebert.ai](https://climatebert.ai)
- **Repository:**
- **Paper:** [papers.ssrn.com/sol3/papers.cfm?abstract_id=3998435](https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3998435)
- **Leaderboard:**
- **Point of Contact:** [Nicolas Webersinke](mailto:nicolas.webersinke@fau.de)
### Dataset Summary
We introduce an expert-annotated dataset for classifying the specificity of climate-related paragraphs in corporate disclosures.
### Supported Tasks and Leaderboards
The dataset supports a binary classification task of whether a given climate-related paragraph is specific or not.
### Languages
The text in the dataset is in English.
## Dataset Structure
### Data Instances
```
{
'text': '− Scope 3: Optional scope that includes indirect emissions associated with the goods and services supply chain produced outside the organization. Included are emissions from the transport of products from our logistics centres to stores (downstream) performed by external logistics operators (air, land and sea transport) as well as the emissions associated with electricity consumption in franchise stores.',
'label': 1
}
```
### Data Fields
- text: a climate-related paragraph extracted from corporate annual reports and sustainability reports
- label: the label (0 -> non-specific, 1 -> specific)
### Data Splits
The dataset is split into:
- train: 1,000
- test: 320
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
Our dataset contains climate-related paragraphs extracted from financial disclosures by firms. We collect text from corporate annual reports and sustainability reports.
For more information regarding our sample selection, please refer to the Appendix of our paper (see [citation](#citation-information)).
#### Who are the source language producers?
Mainly large listed companies.
### Annotations
#### Annotation process
For more information on our annotation process and annotation guidelines, please refer to the Appendix of our paper (see [citation](#citation-information)).
#### Who are the annotators?
The authors and students at Universität Zürich and Friedrich-Alexander-Universität Erlangen-Nürnberg with majors in finance and sustainable finance.
### Personal and Sensitive Information
Since our text sources contain public information, no personal and sensitive information should be included.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
- Julia Anna Bingler
- Mathias Kraus
- Markus Leippold
- Nicolas Webersinke
### Licensing Information
This dataset is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International license (cc-by-nc-sa-4.0). To view a copy of this license, visit [creativecommons.org/licenses/by-nc-sa/4.0](https://creativecommons.org/licenses/by-nc-sa/4.0/).
If you are interested in commercial use of the dataset, please contact [markus.leippold@bf.uzh.ch](mailto:markus.leippold@bf.uzh.ch).
### Citation Information
```bibtex
@techreport{bingler2023cheaptalk,
title={How Cheap Talk in Climate Disclosures Relates to Climate Initiatives, Corporate Emissions, and Reputation Risk},
author={Bingler, Julia and Kraus, Mathias and Leippold, Markus and Webersinke, Nicolas},
type={Working paper},
institution={Available at SSRN 3998435},
year={2023}
}
```
### Contributions
Thanks to [@webersni](https://github.com/webersni) for adding this dataset. |
arize-ai/human_actions_quality_drift | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- mit
multilinguality:
- monolingual
pretty_name: sentiment-classification-reviews-with-drift
size_categories:
- 10K<n<100K
source_datasets:
- extended|imdb
task_categories:
- image-classification
task_ids:
- multi-class-classification
---
# Dataset Card for `reviews_with_drift`
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
### Dataset Summary
This dataset was crafted to be used in our tutorial [Link to the tutorial when ready]. It consists of a large Movie Review Dataset mixed with some reviews from a Hotel Review Dataset. The training and validation sets are obtained purely from the Movie Review Dataset, while the production set is mixed. Some other features have been added (`age`, `gender`, `context`), as well as a made-up timestamp `prediction_ts` of when the inference took place.
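Based on the fields named above, a single record can be sketched as a plain Python dict. This is only an illustrative shape: the concrete values, dtypes, and the 0/1 label encoding are assumptions, not taken from the actual dataset files, so check the dataset viewer before relying on them.

```python
# Hedged sketch of one record's shape, inferred from the summary above.
# Every value below is made up for illustration; only the field names
# (`age`, `gender`, `context`, `prediction_ts`) come from the card itself.
example = {
    "prediction_ts": 1650000000.0,  # made-up timestamp of when inference took place
    "age": 34,
    "gender": "female",
    "context": "mobile",
    "text": "A wonderful film with a heartfelt story.",
    "label": 1,  # assumed encoding: 1 = positive, 0 = negative
}
```

In the production split, some `text` values would come from the Hotel Review Dataset rather than the Movie Review Dataset, which is what introduces the drift the tutorial is built around.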
### Supported Tasks and Leaderboards
`text-classification`, `sentiment-classification`: The dataset is mainly used for text classification: given the text, predict the sentiment (positive or negative).
### Languages
The text is mainly written in English.
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@fjcasti1](https://github.com/fjcasti1) for adding this dataset. |
CorpuSlave/conversation | ---
license: cc-by-nc-sa-4.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
- name: doc_id
dtype: string
splits:
- name: train
num_bytes: 727147995
num_examples: 1175803
download_size: 334681572
dataset_size: 727147995
---
|
Erynan/1k_deon_util | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1499818
num_examples: 2000
download_size: 259937
dataset_size: 1499818
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_mnli_indef_one | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 914496
num_examples: 3903
- name: dev_mismatched
num_bytes: 963128
num_examples: 4028
- name: test_matched
num_bytes: 905382
num_examples: 3839
- name: test_mismatched
num_bytes: 940848
num_examples: 3895
- name: train
num_bytes: 36975946
num_examples: 157024
download_size: 26357395
dataset_size: 40699800
---
# Dataset Card for "MULTI_VALUE_mnli_indef_one"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_maywell__Synatra-RP-Orca-2-7b-v0.1 | ---
pretty_name: Evaluation run of maywell/Synatra-RP-Orca-2-7b-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [maywell/Synatra-RP-Orca-2-7b-v0.1](https://huggingface.co/maywell/Synatra-RP-Orca-2-7b-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_maywell__Synatra-RP-Orca-2-7b-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-07T22:12:28.167170](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__Synatra-RP-Orca-2-7b-v0.1/blob/main/results_2024-01-07T22-12-28.167170.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5617788388118229,\n\
\ \"acc_stderr\": 0.03367703048955596,\n \"acc_norm\": 0.565018654271738,\n\
\ \"acc_norm_stderr\": 0.034368770251827,\n \"mc1\": 0.3733170134638923,\n\
\ \"mc1_stderr\": 0.016932370557570634,\n \"mc2\": 0.5254804392398661,\n\
\ \"mc2_stderr\": 0.015769818879652533\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5511945392491467,\n \"acc_stderr\": 0.014534599585097665,\n\
\ \"acc_norm\": 0.5742320819112628,\n \"acc_norm_stderr\": 0.014449464278868809\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5892252539334794,\n\
\ \"acc_stderr\": 0.004909689876342047,\n \"acc_norm\": 0.7730531766580363,\n\
\ \"acc_norm_stderr\": 0.004180018992862967\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926603,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926603\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n\
\ \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n\
\ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.6041666666666666,\n\
\ \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5028901734104047,\n\
\ \"acc_stderr\": 0.038124005659748335,\n \"acc_norm\": 0.5028901734104047,\n\
\ \"acc_norm_stderr\": 0.038124005659748335\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.043898699568087785,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.043898699568087785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.032650194750335815,\n\
\ \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.032650194750335815\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3306878306878307,\n \"acc_stderr\": 0.02422996529842507,\n \"\
acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.02422996529842507\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6096774193548387,\n\
\ \"acc_stderr\": 0.027751256636969576,\n \"acc_norm\": 0.6096774193548387,\n\
\ \"acc_norm_stderr\": 0.027751256636969576\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4039408866995074,\n \"acc_stderr\": 0.0345245390382204,\n\
\ \"acc_norm\": 0.4039408866995074,\n \"acc_norm_stderr\": 0.0345245390382204\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885416,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885416\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7424242424242424,\n \"acc_stderr\": 0.03115626951964683,\n \"\
acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.03115626951964683\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.027171213683164528,\n\
\ \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.027171213683164528\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5256410256410257,\n \"acc_stderr\": 0.025317649726448666,\n\
\ \"acc_norm\": 0.5256410256410257,\n \"acc_norm_stderr\": 0.025317649726448666\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394849,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394849\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5210084033613446,\n \"acc_stderr\": 0.03244980849990029,\n \
\ \"acc_norm\": 0.5210084033613446,\n \"acc_norm_stderr\": 0.03244980849990029\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7522935779816514,\n \"acc_stderr\": 0.018508143602547832,\n \"\
acc_norm\": 0.7522935779816514,\n \"acc_norm_stderr\": 0.018508143602547832\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.033509916046960436,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.033509916046960436\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7401960784313726,\n \"acc_stderr\": 0.03077855467869326,\n \"\
acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.03077855467869326\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.04010358942462203,\n\
\ \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.04010358942462203\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7107438016528925,\n \"acc_stderr\": 0.041391127276354626,\n \"\
acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.041391127276354626\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6380368098159509,\n \"acc_stderr\": 0.037757007291414416,\n\
\ \"acc_norm\": 0.6380368098159509,\n \"acc_norm_stderr\": 0.037757007291414416\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973647,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973647\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.023902325549560396,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.023902325549560396\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.768837803320562,\n\
\ \"acc_stderr\": 0.015075523238101077,\n \"acc_norm\": 0.768837803320562,\n\
\ \"acc_norm_stderr\": 0.015075523238101077\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.638728323699422,\n \"acc_stderr\": 0.025862201852277906,\n\
\ \"acc_norm\": 0.638728323699422,\n \"acc_norm_stderr\": 0.025862201852277906\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2759776536312849,\n\
\ \"acc_stderr\": 0.014950103002475361,\n \"acc_norm\": 0.2759776536312849,\n\
\ \"acc_norm_stderr\": 0.014950103002475361\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6209150326797386,\n \"acc_stderr\": 0.027780141207023344,\n\
\ \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.027780141207023344\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n\
\ \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n\
\ \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.654320987654321,\n \"acc_stderr\": 0.02646248777700187,\n\
\ \"acc_norm\": 0.654320987654321,\n \"acc_norm_stderr\": 0.02646248777700187\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3723404255319149,\n \"acc_stderr\": 0.02883892147125145,\n \
\ \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.02883892147125145\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3970013037809648,\n\
\ \"acc_stderr\": 0.012496346982909556,\n \"acc_norm\": 0.3970013037809648,\n\
\ \"acc_norm_stderr\": 0.012496346982909556\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4742647058823529,\n \"acc_stderr\": 0.030332578094555033,\n\
\ \"acc_norm\": 0.4742647058823529,\n \"acc_norm_stderr\": 0.030332578094555033\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.553921568627451,\n \"acc_stderr\": 0.020109864547181354,\n \
\ \"acc_norm\": 0.553921568627451,\n \"acc_norm_stderr\": 0.020109864547181354\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.673469387755102,\n \"acc_stderr\": 0.030021056238440313,\n\
\ \"acc_norm\": 0.673469387755102,\n \"acc_norm_stderr\": 0.030021056238440313\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6318407960199005,\n\
\ \"acc_stderr\": 0.034104105654953025,\n \"acc_norm\": 0.6318407960199005,\n\
\ \"acc_norm_stderr\": 0.034104105654953025\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3733170134638923,\n\
\ \"mc1_stderr\": 0.016932370557570634,\n \"mc2\": 0.5254804392398661,\n\
\ \"mc2_stderr\": 0.015769818879652533\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.744277821625888,\n \"acc_stderr\": 0.012261253845440474\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3949962092494314,\n \
\ \"acc_stderr\": 0.01346535496997321\n }\n}\n```"
repo_url: https://huggingface.co/maywell/Synatra-RP-Orca-2-7b-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|arc:challenge|25_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_03T17_42_00.048220
path:
- '**/details_harness|gsm8k|5_2023-12-03T17-42-00.048220.parquet'
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|gsm8k|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hellaswag|10_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-07T22-12-28.167170.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-07T22-12-28.167170.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_07T22_12_28.167170
path:
- '**/details_harness|winogrande|5_2024-01-07T22-12-28.167170.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-07T22-12-28.167170.parquet'
- config_name: results
data_files:
- split: 2023_12_03T17_42_00.048220
path:
- results_2023-12-03T17-42-00.048220.parquet
- split: 2024_01_07T22_12_28.167170
path:
- results_2024-01-07T22-12-28.167170.parquet
- split: latest
path:
- results_2024-01-07T22-12-28.167170.parquet
---
# Dataset Card for Evaluation run of maywell/Synatra-RP-Orca-2-7b-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [maywell/Synatra-RP-Orca-2-7b-v0.1](https://huggingface.co/maywell/Synatra-RP-Orca-2-7b-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_maywell__Synatra-RP-Orca-2-7b-v0.1",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-07T22:12:28.167170](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__Synatra-RP-Orca-2-7b-v0.1/blob/main/results_2024-01-07T22-12-28.167170.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5617788388118229,
"acc_stderr": 0.03367703048955596,
"acc_norm": 0.565018654271738,
"acc_norm_stderr": 0.034368770251827,
"mc1": 0.3733170134638923,
"mc1_stderr": 0.016932370557570634,
"mc2": 0.5254804392398661,
"mc2_stderr": 0.015769818879652533
},
"harness|arc:challenge|25": {
"acc": 0.5511945392491467,
"acc_stderr": 0.014534599585097665,
"acc_norm": 0.5742320819112628,
"acc_norm_stderr": 0.014449464278868809
},
"harness|hellaswag|10": {
"acc": 0.5892252539334794,
"acc_stderr": 0.004909689876342047,
"acc_norm": 0.7730531766580363,
"acc_norm_stderr": 0.004180018992862967
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926603,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926603
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791197,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6041666666666666,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.6041666666666666,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5028901734104047,
"acc_stderr": 0.038124005659748335,
"acc_norm": 0.5028901734104047,
"acc_norm_stderr": 0.038124005659748335
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.043898699568087785,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.043898699568087785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4765957446808511,
"acc_stderr": 0.032650194750335815,
"acc_norm": 0.4765957446808511,
"acc_norm_stderr": 0.032650194750335815
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.02422996529842507,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.02422996529842507
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6096774193548387,
"acc_stderr": 0.027751256636969576,
"acc_norm": 0.6096774193548387,
"acc_norm_stderr": 0.027751256636969576
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4039408866995074,
"acc_stderr": 0.0345245390382204,
"acc_norm": 0.4039408866995074,
"acc_norm_stderr": 0.0345245390382204
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885416,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885416
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.03115626951964683,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.03115626951964683
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.027171213683164528,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.027171213683164528
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5256410256410257,
"acc_stderr": 0.025317649726448666,
"acc_norm": 0.5256410256410257,
"acc_norm_stderr": 0.025317649726448666
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394849,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394849
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5210084033613446,
"acc_stderr": 0.03244980849990029,
"acc_norm": 0.5210084033613446,
"acc_norm_stderr": 0.03244980849990029
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7522935779816514,
"acc_stderr": 0.018508143602547832,
"acc_norm": 0.7522935779816514,
"acc_norm_stderr": 0.018508143602547832
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.033509916046960436,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.033509916046960436
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.03077855467869326,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.03077855467869326
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.04010358942462203,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.04010358942462203
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.041391127276354626,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.041391127276354626
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04557239513497751,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04557239513497751
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6380368098159509,
"acc_stderr": 0.037757007291414416,
"acc_norm": 0.6380368098159509,
"acc_norm_stderr": 0.037757007291414416
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973647,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973647
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.023902325549560396,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.023902325549560396
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.768837803320562,
"acc_stderr": 0.015075523238101077,
"acc_norm": 0.768837803320562,
"acc_norm_stderr": 0.015075523238101077
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.638728323699422,
"acc_stderr": 0.025862201852277906,
"acc_norm": 0.638728323699422,
"acc_norm_stderr": 0.025862201852277906
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2759776536312849,
"acc_stderr": 0.014950103002475361,
"acc_norm": 0.2759776536312849,
"acc_norm_stderr": 0.014950103002475361
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.027780141207023344,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.027780141207023344
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6527331189710611,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.6527331189710611,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.654320987654321,
"acc_stderr": 0.02646248777700187,
"acc_norm": 0.654320987654321,
"acc_norm_stderr": 0.02646248777700187
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3723404255319149,
"acc_stderr": 0.02883892147125145,
"acc_norm": 0.3723404255319149,
"acc_norm_stderr": 0.02883892147125145
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3970013037809648,
"acc_stderr": 0.012496346982909556,
"acc_norm": 0.3970013037809648,
"acc_norm_stderr": 0.012496346982909556
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4742647058823529,
"acc_stderr": 0.030332578094555033,
"acc_norm": 0.4742647058823529,
"acc_norm_stderr": 0.030332578094555033
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.553921568627451,
"acc_stderr": 0.020109864547181354,
"acc_norm": 0.553921568627451,
"acc_norm_stderr": 0.020109864547181354
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.673469387755102,
"acc_stderr": 0.030021056238440313,
"acc_norm": 0.673469387755102,
"acc_norm_stderr": 0.030021056238440313
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6318407960199005,
"acc_stderr": 0.034104105654953025,
"acc_norm": 0.6318407960199005,
"acc_norm_stderr": 0.034104105654953025
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3733170134638923,
"mc1_stderr": 0.016932370557570634,
"mc2": 0.5254804392398661,
"mc2_stderr": 0.015769818879652533
},
"harness|winogrande|5": {
"acc": 0.744277821625888,
"acc_stderr": 0.012261253845440474
},
"harness|gsm8k|5": {
"acc": 0.3949962092494314,
"acc_stderr": 0.01346535496997321
}
}
```
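The Open LLM Leaderboard reports a single MMLU number by averaging the per-subject `hendrycksTest` scores shown above. A minimal sketch of that aggregation (run here over a small inline subset of the entries above, but the same filter-and-average works on the full parsed JSON):

```python
# Subset of the per-task results above, as a plain Python dict.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.31},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.5925925925925926},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.6710526315789473},
}

# Keep only the MMLU (hendrycksTest) entries and average their acc_norm.
mmlu_tasks = {k: v for k, v in results.items() if "hendrycksTest" in k}
mmlu_score = sum(v["acc_norm"] for v in mmlu_tasks.values()) / len(mmlu_tasks)
print(round(mmlu_score, 4))
```

Loading the full `results_*.json` file with `json.load` and passing the `"all"`-stripped dict through the same comprehension yields the MMLU average for the whole run.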
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
lucyd/fcc-evalset | ---
dataset_info:
features:
- name: Question
dtype: string
- name: Answer
dtype: string
splits:
- name: train
num_bytes: 19934
num_examples: 27
download_size: 12169
dataset_size: 19934
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/shirayuki_chiyo_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of shirayuki_chiyo/白雪千夜 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of shirayuki_chiyo/白雪千夜 (THE iDOLM@STER: Cinderella Girls), containing 323 images and their tags.
The core tags of this character are `black_hair, short_hair, bangs, purple_eyes, blunt_bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 323 | 392.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shirayuki_chiyo_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 323 | 225.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shirayuki_chiyo_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 741 | 469.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shirayuki_chiyo_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 323 | 349.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shirayuki_chiyo_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 741 | 680.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shirayuki_chiyo_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/shirayuki_chiyo_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 15 |  |  |  |  |  | 1girl, black_gloves, black_serafuku, black_skirt, long_sleeves, pleated_skirt, red_neckerchief, solo, looking_at_viewer, black_shirt, simple_background, white_sailor_collar, bob_cut, closed_mouth, white_background |
| 1 | 11 |  |  |  |  |  | 1girl, looking_at_viewer, red_neckerchief, solo, upper_body, long_sleeves, simple_background, white_sailor_collar, black_gloves, black_shirt, closed_mouth, white_background, black_serafuku, blush, bob_cut |
| 2 | 5 |  |  |  |  |  | 1girl, black_serafuku, black_shirt, bob_cut, looking_at_viewer, red_neckerchief, solo, upper_body, white_sailor_collar, parted_lips, shaded_face, disgust, simple_background, white_background |
| 3 | 6 |  |  |  |  |  | 1girl, black_bikini, collarbone, solo, bare_shoulders, blush, looking_at_viewer, navel, small_breasts, white_background |
| 4 | 5 |  |  |  |  |  | 1girl, black_dress, black_gloves, elbow_gloves, looking_at_viewer, solo, bare_shoulders, bracelet, frills, bob_cut, bowtie, flower, hair_ornament, parted_lips, ribbon, simple_background, sleeveless_dress, upper_body |
| 5 | 9 |  |  |  |  |  | looking_at_viewer, maid_headdress, 1girl, maid_apron, solo, black_gloves, bowtie, brooch, juliet_sleeves, black_dress, blush, grey_background, blue_bow, enmaided, simple_background, white_apron |
| 6 | 5 |  |  |  |  |  | ascot, black_gloves, feather_hair_ornament, holding_sword, looking_at_viewer, 1girl, belt, closed_mouth, long_sleeves, solo, white_background, blush, brooch, sheathed, simple_background, smile, black_thighhighs, blue_flower, blue_ribbon, bob_cut, cape, cowboy_shot, petals, scabbard, skirt, white_jacket |
| 7 | 9 |  |  |  |  |  | 1boy, 1girl, blush, hetero, vaginal, breasts, penis, serafuku, sex, solo_focus, pussy, black_gloves, missionary, mosaic_censoring, nipples, on_back, open_mouth, pubic_hair, bar_censor, navel, pantyhose, spread_legs, torn_clothes |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_gloves | black_serafuku | black_skirt | long_sleeves | pleated_skirt | red_neckerchief | solo | looking_at_viewer | black_shirt | simple_background | white_sailor_collar | bob_cut | closed_mouth | white_background | upper_body | blush | parted_lips | shaded_face | disgust | black_bikini | collarbone | bare_shoulders | navel | small_breasts | black_dress | elbow_gloves | bracelet | frills | bowtie | flower | hair_ornament | ribbon | sleeveless_dress | maid_headdress | maid_apron | brooch | juliet_sleeves | grey_background | blue_bow | enmaided | white_apron | ascot | feather_hair_ornament | holding_sword | belt | sheathed | smile | black_thighhighs | blue_flower | blue_ribbon | cape | cowboy_shot | petals | scabbard | skirt | white_jacket | 1boy | hetero | vaginal | breasts | penis | serafuku | sex | solo_focus | pussy | missionary | mosaic_censoring | nipples | on_back | open_mouth | pubic_hair | bar_censor | pantyhose | spread_legs | torn_clothes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-----------------|:--------------|:---------------|:----------------|:------------------|:-------|:--------------------|:--------------|:--------------------|:----------------------|:----------|:---------------|:-------------------|:-------------|:--------|:--------------|:--------------|:----------|:---------------|:-------------|:-----------------|:--------|:----------------|:--------------|:---------------|:-----------|:---------|:---------|:---------|:----------------|:---------|:-------------------|:-----------------|:-------------|:---------|:-----------------|:------------------|:-----------|:-----------|:--------------|:--------|:------------------------|:----------------|:-------|:-----------|:--------|:-------------------|:--------------|:--------------|:-------|:--------------|:---------|:-----------|:--------|:---------------|:-------|:---------|:----------|:----------|:--------|:-----------|:------|:-------------|:--------|:-------------|:-------------------|:----------|:----------|:-------------|:-------------|:-------------|:------------|:--------------|:---------------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | X | | X | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | X | | | | X | X | X | X | X | X | X | | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | | | | | | | X | X | | | | | | X | | X | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | | | | | | X | X | | X | | X | | | X | | X | | | | | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 9 |  |  |  |  |  | X | X | | | | | | X | X | | X | | | | | | X | | | | | | | | | X | | | | X | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | X | | | X | | | X | X | | X | | X | X | X | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 7 | 9 |  |  |  |  |  | X | X | | | | | | | | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
blastwind/github-code-scala | ---
dataset_info:
features:
- name: code
dtype: string
- name: repo_name
dtype: string
- name: path
dtype: string
- name: language
dtype: string
- name: license
dtype: string
- name: size
dtype: int64
splits:
- name: train
num_bytes: 3330521484.4803743
num_examples: 654001
- name: valid
num_bytes: 416314548.9934581
num_examples: 81750
- name: test
num_bytes: 416319641.5261675
num_examples: 81751
download_size: 1534670727
dataset_size: 4163155675.0
task_categories:
- text-generation
size_categories:
- 100K<n<1M
---
# Dataset Card for "github-code-scala"
This contains just the Scala data in [github-code-clean](https://huggingface.co/datasets/codeparrot/github-code). There are 817k samples with a total download size of 1.52 GB. |
conglu/vd4rl | ---
license: mit
thumbnail: "https://github.com/conglu1997/v-d4rl/raw/main/figs/envs.png"
tags:
- Reinforcement Learning
- Offline Reinforcement Learning
- Reinforcement Learning from Pixels
- DreamerV2
- DrQ+BC
datasets:
- V-D4RL
---
# V-D4RL
V-D4RL provides pixel-based analogues of the popular D4RL benchmarking tasks, derived from the **`dm_control`** suite, along with natural extensions of two state-of-the-art online pixel-based continuous control algorithms, DrQ-v2 and DreamerV2, to the offline setting. For further details, please see the paper:
**_Challenges and Opportunities in Offline Reinforcement Learning from Visual Observations_**; Cong Lu*, Philip J. Ball*, Tim G. J. Rudner, Jack Parker-Holder, Michael A. Osborne, Yee Whye Teh.
<p align="center">
<a href=https://arxiv.org/abs/2206.04779>View on arXiv</a>
</p>
## Benchmarks
The V-D4RL datasets can be found in this repository under `vd4rl`. **These must be downloaded before running the code.** Assuming the data is stored under `vd4rl_data`, the file structure is:
```
vd4rl_data
└───main
│ └───walker_walk
│ │ └───random
│ │ │ └───64px
│ │ │ └───84px
│ │ └───medium_replay
│ │ │ ...
│ └───cheetah_run
│ │ ...
│ └───humanoid_walk
│ │ ...
└───distracting
│ ...
└───multitask
│ ...
```
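Assuming the data has been extracted under `vd4rl_data` with the layout above, the available `(category, environment, dataset type, resolution)` combinations can be enumerated with a small sketch like the following (the directory names are taken from the tree above; the helper name is ours, not part of the V-D4RL codebase):

```python
import os

def list_vd4rl_datasets(root):
    """Walk a V-D4RL data directory and return
    (category, environment, dataset_type, resolution) tuples,
    e.g. ('main', 'walker_walk', 'random', '64px')."""
    combos = []
    for category in sorted(os.listdir(root)):          # main / distracting / multitask
        cat_dir = os.path.join(root, category)
        if not os.path.isdir(cat_dir):
            continue
        for env in sorted(os.listdir(cat_dir)):        # e.g. walker_walk
            env_dir = os.path.join(cat_dir, env)
            if not os.path.isdir(env_dir):
                continue
            for dtype in sorted(os.listdir(env_dir)):  # e.g. medium_replay
                type_dir = os.path.join(env_dir, dtype)
                if not os.path.isdir(type_dir):
                    continue
                for res in sorted(os.listdir(type_dir)):  # 64px / 84px
                    combos.append((category, env, dtype, res))
    return combos
```

Each tuple maps directly onto an `offline_dir` argument for the run commands below.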
## Baselines
### Environment Setup
Requirements are presented in conda environment files named `conda_env.yml` within each folder. The command to create the environment is:
```
conda env create -f conda_env.yml
```
Alternatively, dockerfiles are located under `dockerfiles`, replace `<<USER_ID>>` in the files with your own user ID from the command `id -u`.
### V-D4RL Main Evaluation
Example run commands are given below for a chosen environment and dataset type:
```
ENVNAME=walker_walk # choice in ['walker_walk', 'cheetah_run', 'humanoid_walk']
TYPE=random # choice in ['random', 'medium_replay', 'medium', 'medium_expert', 'expert']
```
#### Offline DV2
```
python offlinedv2/train_offline.py --configs dmc_vision --task dmc_${ENVNAME} --offline_dir vd4rl_data/main/${ENVNAME}/${TYPE}/64px --offline_penalty_type meandis --offline_lmbd_cons 10 --seed 0
```
#### DrQ+BC
```
python drqbc/train.py task_name=offline_${ENVNAME}_${TYPE} offline_dir=vd4rl_data/main/${ENVNAME}/${TYPE}/84px nstep=3 seed=0
```
#### DrQ+CQL
```
python drqbc/train.py task_name=offline_${ENVNAME}_${TYPE} offline_dir=vd4rl_data/main/${ENVNAME}/${TYPE}/84px algo=cql cql_importance_sample=false min_q_weight=10 seed=0
```
#### BC
```
python drqbc/train.py task_name=offline_${ENVNAME}_${TYPE} offline_dir=vd4rl_data/main/${ENVNAME}/${TYPE}/84px algo=bc seed=0
```
### Distracted and Multitask Experiments
To run the distracted and multitask experiments, it suffices to change the offline directory passed to the commands above.
## Note on data collection and format
We follow the image sizes and dataset format of each algorithm's native codebase.
This means that Offline DV2 uses `*.npz` files with 64px images to store the offline data, whereas DrQ+BC uses `*.hdf5` files with 84px images.
The data collection procedure is detailed in Appendix B of our paper, and we provide conversion scripts in `conversion_scripts`.
For the original SAC policies to generate the data see [here](https://github.com/philipjball/SAC_PyTorch/blob/dmc_branch/train_agent.py).
See [here](https://github.com/philipjball/SAC_PyTorch/blob/dmc_branch/gather_offline_data.py) for distracted/multitask variants.
We used `seed=0` for all data generation.
## Acknowledgements
V-D4RL builds upon many works and open-source codebases in both offline reinforcement learning and online pixel-based continuous control. We would like to particularly thank the authors of:
- [D4RL](https://github.com/rail-berkeley/d4rl)
- [DMControl](https://github.com/deepmind/dm_control)
- [DreamerV2](https://github.com/danijar/dreamerv2)
- [DrQ-v2](https://github.com/facebookresearch/drqv2)
- [LOMPO](https://github.com/rmrafailov/LOMPO)
## Contact
Please contact [Cong Lu](mailto:cong.lu@stats.ox.ac.uk) or [Philip Ball](mailto:ball@robots.ox.ac.uk) for any queries. We welcome any suggestions or contributions!
|
InstaDeepAI/plant-genomic-benchmark | ---
tags:
- DNA
- Genomics
- Plants
pretty_name: Plant Genomic Benchmark
license: cc-by-nc-sa-4.0
---
## Dataset Overview
This dataset features the 8 evaluation tasks presented in the AgroNT (A Foundational Large Language Model for Edible Plant
Genomes) paper. The tasks cover single-output regression, multi-output regression, binary classification, and multi-label classification, and together
aim to provide a comprehensive plant genomics benchmark. Additionally, we provide results from an in silico saturation mutagenesis analysis of sequences
from the cassava genome, assessing the impact of >10 million mutations on gene expression levels and enhancer elements. See the ISM section
below for details regarding the data from this analysis.
| Name | # of Datasets(Species) | Task Type | Sequence Length (base pair) |
| -------- | ------- | -------- | ------- |
| Polyadenylation | 6 | Binary Classification | 400 |
| Splice Site | 2 | Binary Classification | 398 |
| LncRNA | 6 | Binary Classification | 101-6000 |
| Promoter Strength | 2 | Single Variable Regression | 170 |
| Terminator Strength | 2 | Single Variable Regression | 170 |
| Chromatin Accessibility | 7 | Multi-label Classification | 1000 |
| Gene Expression | 6 | Multi-Variable Regression | 6000 |
| Enhancer Region | 1 | Binary Classification | 1000 |
## Dataset Sizes
| Task Name | # Train Samples | # Validation Samples | # Test Samples |
| -------- | ------- | -------- | ------- |
|poly_a.arabidopsis_thaliana|170835|---|30384|
|poly_a.oryza_sativa_indica_group|98139|---|16776|
|poly_a.trifolium_pratense|111138|---|13746|
|poly_a.medicago_truncatula|47277|---|8850|
|poly_a.chlamydomonas_reinhardtii|90378|---|10542|
|poly_a.oryza_sativa_japonica_group|120621|---|20232|
|splicing.arabidopsis_thaliana_donor|2588034|---|377873|
|splicing.arabidopsis_thaliana_acceptor|1704844|---|250084|
|lncrna.m_esculenta|4934|---|360|
|lncrna.z_mays|8423|---|1629|
|lncrna.g_max|11430|---|490|
|lncrna.s_lycopersicum|7274|---|1072|
|lncrna.t_aestivum|11252|---|1810|
|lncrna.s_bicolor|8654|---|734|
|promoter_strength.leaf|58179|6825|7154|
|promoter_strength.protoplast|61051|7162|7595|
|terminator_strength.leaf|43294|5309|4806|
|terminator_strength.protoplast|43289|5309|4811|
|gene_exp.glycine_max|47136|4803|4803|
|gene_exp.oryza_sativa|31244|3702|3702|
|gene_exp.solanum_lycopersicum|27321|3827|3827|
|gene_exp.zea_mays|34493|4483|4483|
|gene_exp.arabidopsis_thaliana|25731|3401|3402|
|chromatin_access.oryza_sativa_MH63_RS2|5120000|14848|14848|
|chromatin_access.setaria_italica|5120000|19968|19968|
|chromatin_access.oryza_sativa_ZS97_RS2|5120000|14848|14848|
|chromatin_access.arabidopis_thaliana|5120000|9984|9984|
|chromatin_access.brachypodium_distachyon|5120000|14848|14848|
|chromatin_access.sorghum_bicolor|5120000|29952|29952|
|chromatin_access.zea_mays|6400000|79872|79872|
|pro_seq.m_esculenta|16852|1229|812|
*** It is important to note that fine-tuning for lncRNA was carried out using all datasets in a single training run. The reason for this is that the individual datasets are small, and combining them helped to improve learning.
## Example Usage
```python
from datasets import load_dataset
task_name='terminator_strength.protoplast' # one of the task names from the above table
dataset = load_dataset("InstaDeepAI/plant-genomic-benchmark", task_name=task_name)
```
## In Silico Saturation Mutagenesis
### File structure for: ISM_Tables/Mesculenta_305_v6_PROseq_ISM_LOG2FC.txt.gz
Intergenic enhancer regions based on Lozano et al. 2021 (https://pubmed.ncbi.nlm.nih.gov/34499719/) <br>
Genome version: Manihot esculenta reference genome v6.1 from Phytozome <br>
CHR: Chromosome <br>
POS: Physical position (bp) <br>
REF: Reference allele <br>
ALT: Alternative allele <br>
LOG2FC: Log fold change in Intergenic enhancer probability (log2(p_mutated_sequence / p_original_sequence)) <br>
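Since LOG2FC is a log2 ratio, recovering the raw probability ratio is a one-liner. As a hedged sketch (column names as listed above; we assume the file is tab-separated, and the helper names are ours — check the actual file header before relying on this):

```python
ENHANCER_COLUMNS = ["CHR", "POS", "REF", "ALT", "LOG2FC"]

def parse_enhancer_row(line):
    """Parse one (assumed tab-separated) row of the enhancer ISM table
    into a dict, converting POS to int and LOG2FC to float."""
    fields = dict(zip(ENHANCER_COLUMNS, line.rstrip("\n").split("\t")))
    fields["POS"] = int(fields["POS"])
    fields["LOG2FC"] = float(fields["LOG2FC"])
    return fields

def fold_change(log2fc):
    """Convert log2(p_mutated / p_original) back to the raw ratio."""
    return 2.0 ** log2fc
```

For example, a LOG2FC of -1.0 corresponds to the mutated sequence halving the predicted enhancer probability.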
### File structure for: ISM_Tables/Mesculenta_v6_GeneExpression_ISM_LOG2FC.txt.gz
Gene expression prediction based on: Wilson et al. 2016 (https://pubmed.ncbi.nlm.nih.gov/28116755/) <br>
Genome version: Manihot esculenta reference genome v6 from Ensembl 56 <br>
CHR: Chromosome <br>
POS: Physical position (bp) <br>
REF: Reference allele <br>
ALT: Alternative allele <br>
GENE: Gene ID <br>
STRAND: Gene strand <br>
TISSUE: Tissue type (Acronyms detailed in Figure 1 of Wilson et al.) <br>
LOG2FC: Gene expression log fold change (log2(gene_exp_mutated_sequence / gene_exp_original_sequence)) <br> |
hmmamalrjoub/Islam_Question_and_Answer1 | ---
task_categories:
- question-answering
language:
- ar
--- |
srmisa/elsalvador-preguntas-respuestas | ---
dataset_info:
features:
- name: texto
dtype: string
- name: pregunta
dtype: string
splits:
- name: train
num_bytes: 682
num_examples: 3
- name: eval
num_bytes: 239
num_examples: 1
download_size: 5702
dataset_size: 921
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: eval
path: data/eval-*
---
|
CyberHarem/xiao_genshin | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of xiao_genshin
This is the dataset of xiao_genshin, containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 447 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 447 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 447 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 447 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
ai4bharat/samanantar | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
- as
- bn
- gu
- hi
- kn
- ml
- mr
- or
- pa
- ta
- te
license:
- cc-by-nc-4.0
multilinguality:
- translation
size_categories:
- unknown
source_datasets:
- original
task_categories:
- text-generation
- translation
task_ids: []
pretty_name: Samanantar
tags:
- conditional-text-generation
---
# Dataset Card for Samanantar
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://indicnlp.ai4bharat.org/samanantar/
- **Repository:**
- **Paper:** [Samanantar: The Largest Publicly Available Parallel Corpora Collection for 11 Indic Languages](https://arxiv.org/abs/2104.05596)
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
Samanantar is the largest publicly available parallel corpora collection for Indic languages: Assamese, Bengali,
Gujarati, Hindi, Kannada, Malayalam, Marathi, Oriya, Punjabi, Tamil, Telugu.
The corpus has 49.6M sentence pairs between English and the Indic languages.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
Samanantar contains parallel sentences between English (`en`) and 11 Indic language:
- Assamese (`as`),
- Bengali (`bn`),
- Gujarati (`gu`),
- Hindi (`hi`),
- Kannada (`kn`),
- Malayalam (`ml`),
- Marathi (`mr`),
- Odia (`or`),
- Punjabi (`pa`),
- Tamil (`ta`) and
- Telugu (`te`).
## Dataset Structure
### Data Instances
```
{
'idx': 0,
'src': 'Prime Minister Narendra Modi met Her Majesty Queen Maxima of the Kingdom of the Netherlands today.',
'tgt': 'নতুন দিল্লিতে সোমবার প্রধানমন্ত্রী শ্রী নরেন্দ্র মোদীর সঙ্গে নেদারন্যান্ডসের মহারানী ম্যাক্সিমা সাক্ষাৎ করেন।',
'data_source': 'pmi'
}
```
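Records with the fields above can be reshaped into the language-keyed translation-pair layout many MT pipelines expect. A minimal sketch, using the example record shown above (the helper name and the output layout are our own, not part of the dataset):

```python
def to_translation_pair(example, tgt_lang):
    """Reshape a Samanantar record (idx/src/tgt/data_source fields)
    into a translation-pair dict keyed by language code."""
    return {
        "id": example["idx"],
        "translation": {"en": example["src"], tgt_lang: example["tgt"]},
        "source": example["data_source"],
    }

example = {
    "idx": 0,
    "src": "Prime Minister Narendra Modi met Her Majesty Queen Maxima of the Kingdom of the Netherlands today.",
    "tgt": "নতুন দিল্লিতে সোমবার প্রধানমন্ত্রী শ্রী নরেন্দ্র মোদীর সঙ্গে নেদারন্যান্ডসের মহারানী ম্যাক্সিমা সাক্ষাৎ করেন।",
    "data_source": "pmi",
}
pair = to_translation_pair(example, "bn")
```

Here `"bn"` is supplied by the caller, since the record itself does not carry the destination-language code.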
### Data Fields
- `idx` (int): ID.
- `src` (string): Sentence in source language (English).
- `tgt` (string): Sentence in destination language (one of the 11 Indic languages).
- `data_source` (string): Source of the data.
For created data sources, `data_source` might be one of the following, depending on the destination language:
- anuvaad_catchnews
- anuvaad_DD_National
- anuvaad_DD_sports
- anuvaad_drivespark
- anuvaad_dw
- anuvaad_financialexpress
- anuvaad-general_corpus
- anuvaad_goodreturns
- anuvaad_indianexpress
- anuvaad_mykhel
- anuvaad_nativeplanet
- anuvaad_newsonair
- anuvaad_nouns_dictionary
- anuvaad_ocr
- anuvaad_oneindia
- anuvaad_pib
- anuvaad_pib_archives
- anuvaad_prothomalo
- anuvaad_timesofindia
- asianetnews
- betterindia
- bridge
- business_standard
- catchnews
- coursera
- dd_national
- dd_sports
- dwnews
- drivespark
- fin_express
- goodreturns
- gu_govt
- jagran-business
- jagran-education
- jagran-sports
- ie_business
- ie_education
- ie_entertainment
- ie_general
- ie_lifestyle
- ie_news
- ie_sports
- ie_tech
- indiccorp
- jagran-entertainment
- jagran-lifestyle
- jagran-news
- jagran-tech
- khan_academy
- Kurzgesagt
- marketfeed
- mykhel
- nativeplanet
- nptel
- ocr
- oneindia
- pa_govt
- pmi
- pranabmukherjee
- sakshi
- sentinel
- thewire
- toi
- tribune
- vsauce
- wikipedia
- zeebiz
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[Creative Commons Attribution-NonCommercial 4.0 International](https://creativecommons.org/licenses/by-nc/4.0/).
### Citation Information
```
@misc{ramesh2021samanantar,
title={Samanantar: The Largest Publicly Available Parallel Corpora Collection for 11 Indic Languages},
author={Gowtham Ramesh and Sumanth Doddapaneni and Aravinth Bheemaraj and Mayank Jobanputra and Raghavan AK and Ajitesh Sharma and Sujit Sahoo and Harshita Diddee and Mahalakshmi J and Divyanshu Kakwani and Navneet Kumar and Aswin Pradeep and Srihari Nagaraj and Kumar Deepak and Vivek Raghavan and Anoop Kunchukuttan and Pratyush Kumar and Mitesh Shantadevi Khapra},
year={2021},
eprint={2104.05596},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
### Contributions
Thanks to [@albertvillanova](https://github.com/albertvillanova) for adding this dataset.
|
scrapalot/Techie_Filtered_Meta | ---
dataset_info:
features:
- name: lens_id
dtype: string
- name: title
dtype: string
- name: publication_type
dtype: string
- name: year_published
dtype: float32
- name: date_published
dtype: string
- name: date_published_parts
sequence: int64
- name: created
dtype: string
- name: external_ids
list:
- name: type
dtype: string
- name: value
dtype: string
- name: open_access
struct:
- name: colour
dtype: string
- name: license
dtype: string
- name: authors
list:
- name: affiliations
list:
- name: country_code
dtype: string
- name: grid_id
dtype: string
- name: ids
list:
- name: type
dtype: string
- name: value
dtype: string
- name: name
dtype: string
- name: name_original
dtype: string
- name: collective_name
dtype: string
- name: first_name
dtype: string
- name: ids
list:
- name: type
dtype: string
- name: value
dtype: string
- name: initials
dtype: string
- name: last_name
dtype: string
- name: source
struct:
- name: asjc_codes
sequence: string
- name: asjc_subjects
sequence: string
- name: country
dtype: string
- name: issn
list:
- name: type
dtype: string
- name: value
dtype: string
- name: publisher
dtype: string
- name: title
dtype: string
- name: type
dtype: string
- name: fields_of_study
sequence: string
- name: languages
sequence: string
- name: start_page
dtype: string
- name: end_page
dtype: string
- name: author_count
dtype: float64
- name: is_open_access
dtype: bool
- name: source_urls
list:
- name: type
dtype: string
- name: url
dtype: string
- name: abstract
dtype: string
- name: references
list:
- name: lens_id
dtype: string
- name: references_count
dtype: float64
- name: scholarly_citations_count
dtype: float64
- name: scholarly_citations
sequence: string
- name: patent_citations
list:
- name: lens_id
dtype: string
- name: patent_citations_count
dtype: float64
- name: issue
dtype: string
- name: publication_supplementary_type
sequence: string
- name: volume
dtype: string
- name: conference
struct:
- name: instance
dtype: string
- name: location
dtype: string
- name: name
dtype: string
- name: mesh_terms
list:
- name: mesh_heading
dtype: string
- name: mesh_id
dtype: string
- name: qualifier_id
dtype: string
- name: qualifier_name
dtype: string
- name: chemicals
list:
- name: mesh_id
dtype: string
- name: registry_number
dtype: string
- name: substance_name
dtype: string
- name: keywords
sequence: string
- name: funding
list:
- name: country
dtype: string
- name: funding_id
dtype: string
- name: org
dtype: string
- name: clinical_trials
list:
- name: id
dtype: string
- name: registry
dtype: string
- name: pdf_urls
sequence: string
- name: domain
sequence: string
splits:
- name: train
num_bytes: 726171677.8835785
num_examples: 423860
download_size: 619298847
dataset_size: 726171677.8835785
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/odin_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of odin/オーディン/奥丁 (Azur Lane)
This is the dataset of odin/オーディン/奥丁 (Azur Lane), containing 42 images and their tags.
The core tags of this character are `multicolored_hair, white_hair, red_hair, blue_eyes, long_hair, streaked_hair, hair_over_one_eye, hat, peaked_cap, military_hat, black_headwear, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 42 | 70.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/odin_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 42 | 35.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/odin_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 92 | 74.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/odin_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 42 | 61.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/odin_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 92 | 111.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/odin_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/odin_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
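The IMG+TXT packages need no waifuc at all: after extraction, each image sits next to a same-stem `.txt` file holding its comma-separated tags. Below is a minimal sketch of pairing the two after extraction; the helper name and the image-extension list are our assumptions, not part of the packages.

```python
import os

def pair_images_with_tags(filenames):
    """Map each image file to its same-stem .txt tag file, if one exists."""
    names = set(filenames)
    image_exts = {'.png', '.jpg', '.jpeg', '.webp'}  # assumed extension set
    pairs = {}
    for name in filenames:
        stem, ext = os.path.splitext(name)
        if ext.lower() in image_exts and stem + '.txt' in names:
            pairs[name] = stem + '.txt'
    return pairs

# e.g. a listing such as one produced by extracting dataset-800.zip
listing = ['1.png', '1.txt', '2.jpg', '2.txt', 'meta.json']
print(pair_images_with_tags(listing))  # {'1.png': '1.txt', '2.jpg': '2.txt'}
```

Reading each paired `.txt` file then yields the caption text for its image.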
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------|
| 0 | 42 |  |  |  |  |  | 1girl, solo, looking_at_viewer, black_coat, iron_cross, breastplate, sword, open_coat, holding, sheath |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | black_coat | iron_cross | breastplate | sword | open_coat | holding | sheath |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:-------------|:-------------|:--------------|:--------|:------------|:----------|:---------|
| 0 | 42 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X |
|
Rony71/nest | ---
license: apache-2.0
---
|
nlp-with-deeplearning/Ko.SlimOrca | ---
license: cc-by-nc-sa-4.0
task_categories:
- conversational
- text-classification
- token-classification
- table-question-answering
- question-answering
- zero-shot-classification
- summarization
- feature-extraction
- text-generation
- text2text-generation
language:
- en
- ko
size_categories:
- 100K<n<1M
---
Original dataset: [Open-Orca/SlimOrca](https://huggingface.co/datasets/Open-Orca/SlimOrca) |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_238 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1338951584.0
num_examples: 262952
download_size: 1364577318
dataset_size: 1338951584.0
---
# Dataset Card for "chunk_238"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_wnli_present_perfect_ever | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 645
num_examples: 3
- name: train
num_bytes: 6237
num_examples: 28
download_size: 8398
dataset_size: 6882
---
# Dataset Card for "MULTI_VALUE_wnli_present_perfect_ever"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Prashantbhatt20/llm-tolkien1 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 98352.0
num_examples: 12
- name: test
num_bytes: 32784.0
num_examples: 4
download_size: 57850
dataset_size: 131136.0
---
# Dataset Card for "llm-tolkien"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
automated-research-group/boolq | ---
dataset_info:
features:
- name: id
dtype: string
- name: request
dtype: string
- name: response
dtype: string
splits:
- name: validation
num_bytes: 2490820
num_examples: 3270
download_size: 1390879
dataset_size: 2490820
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "boolq"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ashwathjadhav23/Dutch_MLM_10 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 53484104
num_examples: 25000
download_size: 32103697
dataset_size: 53484104
---
# Dataset Card for "Dutch_MLM_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/find_second_sent_train_100_eval_10 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 265914
num_examples: 210
- name: validation
num_bytes: 9977
num_examples: 10
download_size: 135955
dataset_size: 275891
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "find_second_sent_train_100_eval_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlekseyKorshuk/light-chatml | ---
dataset_info:
features:
- name: conversation
list:
- name: content
dtype: string
- name: do_train
dtype: bool
- name: role
dtype: string
splits:
- name: train
num_bytes: 32192103
num_examples: 11024
download_size: 15589538
dataset_size: 32192103
---
# Dataset Card for "light-chatml"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ai4bharat/IndicSentenceSummarization | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- as
- bn
- gu
- hi
- kn
- ml
- mr
- or
- pa
- ta
- te
license:
- cc-by-nc-4.0
multilinguality:
- multilingual
pretty_name: IndicSentenceSummarization
size_categories:
- 5K<n<112K
source_datasets:
- original for Hindi, and modified [IndicGLUE](https://indicnlp.ai4bharat.org/indic-glue/) for other languages.
task_categories:
- conditional-text-generation
task_ids:
- conditional-text-generation-other-sentence-summarization
---
# Dataset Card for "IndicSentenceSummarization"
## Table of Contents
- [Dataset Card Creation Guide](#dataset-card-creation-guide)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://indicnlp.ai4bharat.org/indicnlg-suite
- **Paper:** [IndicNLG Suite: Multilingual Datasets for Diverse NLG Tasks in Indic Languages](https://arxiv.org/abs/2203.05437)
- **Point of Contact:**
### Dataset Summary
IndicSentenceSummarization is the sentence summarization dataset released as part of the IndicNLG Suite. Each input sentence is paired with an output summary. The dataset covers eleven languages: as, bn, gu, hi, kn, ml, mr, or, pa, ta and te. The total size of the dataset is 431K examples.
### Supported Tasks and Leaderboards
**Tasks:** Sentence Summarization
**Leaderboards:** Currently there is no Leaderboard for this dataset.
### Languages
- `Assamese (as)`
- `Bengali (bn)`
- `Gujarati (gu)`
- `Kannada (kn)`
- `Hindi (hi)`
- `Malayalam (ml)`
- `Marathi (mr)`
- `Oriya (or)`
- `Punjabi (pa)`
- `Tamil (ta)`
- `Telugu (te)`
## Dataset Structure
### Data Instances
One random example from the `hi` dataset is given below in JSON format.
```
{'id': '5',
'input': 'जम्मू एवं कश्मीर के अनंतनाग जिले में शनिवार को सुरक्षाबलों के साथ मुठभेड़ में दो आतंकवादियों को मार गिराया गया।',
'target': 'जम्मू-कश्मीर : सुरक्षाबलों के साथ मुठभेड़ में 2 आतंकवादी ढेर',
'url': 'https://www.indiatv.in/india/national-jammu-kashmir-two-millitant-killed-in-encounter-with-security-forces-574529'
}
```
### Data Fields
- `id (string)`: Unique identifier.
- `input (string)`: Input sentence.
- `target (strings)`: Output summary.
- `url (string)`: Source web link of the sentence.
### Data Splits
Here is the number of samples in each split for all the languages.
Language | ISO 639-1 Code | Train | Dev | Test |
---------- | ---------- | ---------- | ---------- | ---------- |
Assamese | as | 10,812 | 5,232 | 5,452 |
Bengali | bn | 17,035 | 2,355 | 2,384 |
Gujarati | gu | 54,788 | 8,720 | 8,460 |
Hindi | hi | 78,876 | 16,935 | 16,835 |
Kannada | kn | 61,220 | 9,024 | 1,485 |
Malayalam | ml | 2,855 | 1,520 | 1,580 |
Marathi | mr | 27,066 | 3,249 | 3,309 |
Oriya | or | 12,065 | 1,539 | 1,440 |
Punjabi | pa | 31,630 | 4,004 | 3,967 |
Tamil | ta | 23,098 | 2,874 | 2,948 |
Telugu | te | 7,119 | 878 | 862 |
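As a quick sanity check (a sketch of ours, not part of the release), the per-language split sizes in the table above sum to the ~431K total quoted in the summary:

```python
# Per-language (train, dev, test) counts copied from the split table above.
splits = {
    'as': (10812, 5232, 5452),
    'bn': (17035, 2355, 2384),
    'gu': (54788, 8720, 8460),
    'hi': (78876, 16935, 16835),
    'kn': (61220, 9024, 1485),
    'ml': (2855, 1520, 1580),
    'mr': (27066, 3249, 3309),
    'or': (12065, 1539, 1440),
    'pa': (31630, 4004, 3967),
    'ta': (23098, 2874, 2948),
    'te': (7119, 878, 862),
}
total = sum(train + dev + test for train, dev, test in splits.values())
print(total)  # 431616, i.e. ~431K examples
```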
## Dataset Creation
### Curation Rationale
[Detailed in the paper](https://arxiv.org/abs/2203.05437)
### Source Data
It is a modified subset of [IndicHeadlineGeneration](https://huggingface.co/datasets/ai4bharat/IndicHeadlineGeneration) dataset.
#### Initial Data Collection and Normalization
[Detailed in the paper](https://arxiv.org/abs/2203.05437)
#### Who are the source language producers?
[Detailed in the paper](https://arxiv.org/abs/2203.05437)
### Annotations
[More information needed]
#### Annotation process
[More information needed]
#### Who are the annotators?
[More information needed]
### Personal and Sensitive Information
[More information needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More information needed]
### Discussion of Biases
[More information needed]
### Other Known Limitations
[More information needed]
## Additional Information
### Dataset Curators
[More information needed]
### Licensing Information
Contents of this repository are restricted to only non-commercial research purposes under the [Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC 4.0)](https://creativecommons.org/licenses/by-nc/4.0/). Copyright of the dataset contents belongs to the original copyright holders.
### Citation Information
If you use any of the datasets, models or code modules, please cite the following paper:
```
@inproceedings{Kumar2022IndicNLGSM,
title={IndicNLG Suite: Multilingual Datasets for Diverse NLG Tasks in Indic Languages},
author={Aman Kumar and Himani Shrotriya and Prachi Sahu and Raj Dabre and Ratish Puduppully and Anoop Kunchukuttan and Amogh Mishra and Mitesh M. Khapra and Pratyush Kumar},
year={2022},
  url = "https://arxiv.org/abs/2203.05437",
}
```
### Contributions
[Detailed in the paper](https://arxiv.org/abs/2203.05437) |
0Flopper/satoru | ---
license: openrail
---
|
google/wit | ---
annotations_creators:
- machine-generated
language_creators:
- found
language:
- af
- ar
- ast
- azb
- be
- bg
- bn
- br
- ca
- cs
- cy
- da
- de
- el
- en
- eo
- es
- et
- eu
- fa
- fi
- fr
- fy
- ga
- gl
- hr
- hu
- hy
- id
- it
- iw
- ja
- ka
- ko
- la
- lt
- lv
- mk
- ml
- ms
- nl
- nn
- 'no'
- pl
- pt
- ro
- ru
- sk
- sl
- sr
- sv
- th
- tr
- uk
- ur
- vi
- vo
- zh
license:
- cc-by-sa-3.0
multilinguality:
- multilingual
paperswithcode_id: wit
pretty_name: Wikipedia-based Image Text
size_categories:
- 10M<n<100M
source_datasets:
- original
- extended|wikipedia
task_categories:
- text-retrieval
- image-to-text
task_ids:
- text-retrieval-other-text-image-retrieval
- image-captioning
---
# Dataset Card for WIT
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Dataset Preprocessing](#dataset-preprocessing)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [WIT homepage](https://github.com/google-research-datasets/wit)
- **Repository:** [WIT repository](https://github.com/google-research-datasets/wit)
- **Paper:** [WIT: Wikipedia-based Image Text Dataset for Multimodal Multilingual Machine Learning
](https://arxiv.org/abs/2103.01913)
- **Leaderboard:** [WIT leaderboard](https://www.kaggle.com/c/wikipedia-image-caption)
- **Point of Contact:** [WIT e-mail](mailto:wit-dataset@google.com)
### Dataset Summary
Wikipedia-based Image Text (WIT) Dataset is a large multimodal multilingual dataset. WIT is composed of a curated set of 37.6 million entity rich image-text examples with 11.5 million unique images across 108 Wikipedia languages. Its size enables WIT to be used as a pretraining dataset for multimodal machine learning models.
A few unique advantages of WIT:
* The largest multimodal dataset (at the time of this writing) by the number of image-text examples.
* Massively multilingual (the first of its kind), with coverage for over 100 languages.
* A diverse collection of concepts and real-world entities.
* Brings forth challenging real-world test sets.
### Dataset Preprocessing
This dataset doesn't download the images locally by default. Instead, it exposes URLs to the images. To fetch the images, use the following code:
```python
from concurrent.futures import ThreadPoolExecutor
from functools import partial
import io
import urllib
import PIL.Image
from datasets import load_dataset
from datasets.utils.file_utils import get_datasets_user_agent
def fetch_single_image(image_url, timeout=None, retries=0):
for _ in range(retries + 1):
try:
request = urllib.request.Request(
image_url,
data=None,
headers={"user-agent": get_datasets_user_agent()},
)
with urllib.request.urlopen(request, timeout=timeout) as req:
image = PIL.Image.open(io.BytesIO(req.read()))
break
except Exception:
image = None
return image
def fetch_images(batch, num_threads, timeout=None, retries=0):
fetch_single_image_with_args = partial(fetch_single_image, timeout=timeout, retries=retries)
with ThreadPoolExecutor(max_workers=num_threads) as executor:
batch["image"] = list(executor.map(fetch_single_image_with_args, batch["image_url"]))
return batch
num_threads = 20
dset = load_dataset("wit")
dset = dset.map(fetch_images, batched=True, batch_size=100, fn_kwargs={"num_threads": num_threads})
```
### Supported Tasks and Leaderboards
- `image-captioning`: This dataset can be used to train a model for image captioning where the goal is to predict a caption given the image.
- `text-retrieval`: The goal in this task is to build a model that retrieves the text closest to an image.
In these tasks, any combination of the `caption_reference_description`, `caption_attribution_description` and `caption_alt_text_description` fields can be used as the input text/caption.
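For example, a small helper (ours, not part of the dataset loader) that concatenates whichever of those caption fields are present in a record into a single input text:

```python
# Caption fields of a WIT record, any of which may be None.
CAPTION_FIELDS = (
    'caption_reference_description',
    'caption_attribution_description',
    'caption_alt_text_description',
)

def combine_captions(record, fields=CAPTION_FIELDS):
    """Join the non-empty caption fields of a WIT record into one string."""
    return ' '.join(record.get(f) for f in fields if record.get(f))

# The data instance shown below has only the attribution description set.
example = {
    'caption_reference_description': None,
    'caption_attribution_description': 'English: Mounted skeleton of Oxydactylus longipes in the Field Museum of Natural History.',
    'caption_alt_text_description': None,
}
print(combine_captions(example))
```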
### Languages
The dataset contains examples from all Wikipedia languages, with the following stats:
Image-Text | # Lang | Uniq. Images | # Lang
------------ | ------ | ------------- | ------
total > 1M | 9 | images > 1M | 6
total > 500K | 10 | images > 500K | 12
total > 100K | 36 | images > 100K | 35
total > 50K | 15 | images > 50K | 17
total > 14K | 38 | images > 13K | 38
## Dataset Structure
### Data Instances
```
{
'language': 'en',
'page_url': 'https://en.wikipedia.org/wiki/Oxydactylus',
'image_url': 'https://upload.wikimedia.org/wikipedia/commons/5/5f/Oxydactylus_longipes_fm.jpg',
'page_title': 'Oxydactylus',
'section_title': None,
'hierarchical_section_title': 'Oxydactylus',
'caption_reference_description': None,
'caption_attribution_description': 'English: Mounted skeleton of Oxydactylus longipes in the Field Museum of Natural History.',
'caption_alt_text_description': None,
'mime_type': 'image/jpeg',
'original_height': 3564,
'original_width': 2748,
'is_main_image': True,
'attribution_passes_lang_id': True,
'page_changed_recently': True,
'context_page_description': 'Oxydactylus is an extinct genus of camelid endemic to North America. It lived from the Late Oligocene to the Middle Miocene, existing for approximately 14 million years. The name is from the Ancient Greek οξύς and δάκτυλος.\nThey had very long legs and necks, and were probably adapted to eating high vegetation, much like modern giraffes. Unlike modern camelids, they had hooves, rather than tough sole-pads, and splayed toes.',
'context_section_description': 'Oxydactylus is an extinct genus of camelid endemic to North America. It lived from the Late Oligocene to the Middle Miocene (28.4–13.7 mya), existing for approximately 14 million years. The name is from the Ancient Greek οξύς (oxys, "sharp")and δάκτυλος (daktylos, "finger").\n \nThey had very long legs and necks, and were probably adapted to eating high vegetation, much like modern giraffes. Unlike modern camelids, they had hooves, rather than tough sole-pads, and splayed toes.'
}
```
### Data Fields
- `language`: Language code depicting wikipedia language of the page
- `page_url`: URL to wikipedia page
- `image_url`: URL to wikipedia image
- `page_title`: Wikipedia page's title
- `section_title`: Section's title
- `hierarchical_section_title`: Hierarchical section's title
- `caption_reference_description`: This is the caption that is visible on the wiki page directly below the image.
- `caption_attribution_description`: This is the text found on the Wikimedia page of the image. This text is common to all occurrences of that image across all Wikipedias and thus can be in a language different from the original page article.
- `caption_alt_text_description`: This is the “alt” text associated with the image. While not visible in general, it is commonly used for accessibility / screen readers
- `mime_type`: Mime type associated to the image.
- `original_height`: Image height
- `original_width`: Image width
- `is_main_image`: Flag determining if the image is the first image of the page. Usually displayed on the top-right part of the page when using web browsers.
- `attribution_passes_lang_id`: Whether the `language` field matches the attribution language (written in the prefix of the attribution description).
- `page_changed_recently`: [More Information Needed]
- `context_page_description`: Page description corresponds to the short description of the page. It provides a concise explanation of the scope of the page.
- `context_section_description`: Text within the image's section.
<p align='center'>
<img width='75%' src='https://production-media.paperswithcode.com/datasets/Screenshot_2021-03-04_at_14.26.02.png' alt="Half Dome" /> </br>
<b>Figure: WIT annotation example. </b>
</p>
Details on the field content can be found directly in the [paper, figure 5 and table 12.](https://arxiv.org/abs/2103.01913)
### Data Splits
All data is held in `train` split, with a total of 37046386 rows.
## Dataset Creation
### Curation Rationale
From the [repository](https://github.com/google-research-datasets/wit#motivation):
> Multimodal visio-linguistic models rely on a rich dataset to help them learn to model the relationship between images and texts. Having large image-text datasets can significantly improve performance, as shown by recent works. Furthermore the lack of language coverage in existing datasets (which are mostly only in English) also impedes research in the multilingual multimodal space – we consider this a lost opportunity given the potential shown in leveraging images (as a language-agnostic medium) to help improve our multilingual textual understanding.
>
> To address these challenges and advance research on multilingual, multimodal learning we created the Wikipedia-based Image Text (WIT) Dataset. WIT is created by extracting multiple different texts associated with an image (e.g., as shown in the above image) from Wikipedia articles and Wikimedia image links. This was accompanied by rigorous filtering to only retain high quality image-text sets.
>
> The resulting dataset contains over 37.6 million image-text sets – making WIT the largest multimodal dataset (publicly available at the time of this writing) with unparalleled multilingual coverage – with 12K+ examples in each of 108 languages (53 languages have 100K+ image-text pairs).
### Source Data
#### Initial Data Collection and Normalization
From the [paper, section 3.1](https://arxiv.org/abs/2103.01913):
> We started with all Wikipedia content pages (i.e., ignoring other
pages that have discussions, comments and such). These number about ∼124M pages across 279 languages.
#### Who are the source language producers?
Text was extracted from Wikipedia.
### Annotations
#### Annotation process
WIT was constructed using an automatic process. However, it was human-validated.
From the [paper, section 3.7](https://arxiv.org/abs/2103.01913):
> To further verify the quality of the WIT dataset we performed a
study using (crowd-sourced) human annotators. As seen in Fig. 3,
we asked raters to answer 3 questions. Given an image and the page
title, raters first evaluate the quality of the attribution description
and reference description in the first two questions (order randomized). The third question understands the contextual quality of these
text descriptions given the page description and caption. Each response is on a 3-point scale: "Yes" if the text perfectly describes
the image, "Maybe" if it is sufficiently explanatory and "No" if it is
irrelevant or the image is inappropriate.
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
From the [paper, section 3.4](https://arxiv.org/abs/2103.01913):
> Lastly we found that certain image-text pairs occurred very
frequently. These were often generic images that did not have
much to do with the main article page. Common examples
included flags, logos, maps, insignia and such. To prevent
biasing the data, we heavily under-sampled all such images
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```bibtex
@article{srinivasan2021wit,
title={WIT: Wikipedia-based Image Text Dataset for Multimodal Multilingual Machine Learning},
author={Srinivasan, Krishna and Raman, Karthik and Chen, Jiecao and Bendersky, Michael and Najork, Marc},
journal={arXiv preprint arXiv:2103.01913},
year={2021}
}
```
### Contributions
Thanks to [@thomasw21](https://github.com/thomasw21), [@nateraw](https://github.com/nateraw) and [@hassiahk](https://github.com/hassiahk) for adding this dataset. |
tr416/dataset_20231006_231107 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 762696.0
num_examples: 297
- name: test
num_bytes: 7704.0
num_examples: 3
download_size: 73705
dataset_size: 770400.0
---
# Dataset Card for "dataset_20231006_231107"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
formido/recipes-20k | ---
language:
- en
license: apache-2.0
---
|
Falah/countries_jokes_dataset | ---
dataset_info:
features:
- name: country
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 44275
num_examples: 504
download_size: 16467
dataset_size: 44275
---
# Dataset Card for "countries_jokes_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Avil/bengali_Female_Voice | ---
license: openrail
---
|
truocpham/fintechqa | ---
license: mit
---
|
Ezell/test | ---
license: apache-2.0
task_categories:
- token-classification
- summarization
language:
- ab
- am
- ar
tags:
- biology
size_categories:
- 10K<n<100K
--- |
CyberHarem/meltryllis_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of meltryllis/メルトリリス/Meltryllis (Fate/Grand Order)
This is the dataset of meltryllis/メルトリリス/Meltryllis (Fate/Grand Order), containing 500 images and their tags.
The core tags of this character are `long_hair, purple_hair, blue_eyes, very_long_hair, ribbon, hair_ribbon, breasts, blue_ribbon, small_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 844.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/meltryllis_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 724.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/meltryllis_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1252 | 1.33 GiB | [Download](https://huggingface.co/datasets/CyberHarem/meltryllis_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/meltryllis_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, long_sleeves, looking_at_viewer, solo, smile, upper_body, white_background, closed_mouth, simple_background, sleeves_past_fingers |
| 1 | 5 |  |  |  |  |  | 1girl, long_sleeves, looking_at_viewer, simple_background, sleeves_past_wrists, solo, white_background, armored_boots, navel, smile, ass, purple_eyes |
| 2 | 15 |  |  |  |  |  | 1girl, armored_boots, long_sleeves, solo, navel, looking_at_viewer, crotch_plate, sleeves_past_fingers, prosthetic_leg, closed_mouth, thighs |
| 3 | 5 |  |  |  |  |  | 1girl, armored_boots, blush, juliet_sleeves, looking_at_viewer, navel, solo, sleeves_past_fingers, spikes, open_mouth, revealing_clothes, thighs, :d, ass, thighhighs |
| 4 | 9 |  |  |  |  |  | 1girl, black_jacket, blue_bow, hood_up, long_sleeves, looking_at_viewer, penguin_hood, sleeves_past_fingers, solo, black_bikini, choker, collarbone, bare_shoulders, closed_mouth, blush, hair_between_eyes, thighs, smile |
| 5 | 6 |  |  |  |  |  | 1girl, black_jacket, blue_bow, choker, closed_mouth, collarbone, hair_between_eyes, hood_up, looking_at_viewer, penguin_hood, smile, solo, :q, bare_shoulders, licking_lips, simple_background, white_background, black_bikini, blush, long_sleeves, sleeves_past_fingers, upper_body |
| 6 | 38 |  |  |  |  |  | 1girl, bare_shoulders, blue_one-piece_swimsuit, frills, highleg_swimsuit, long_sleeves, looking_at_viewer, solo, puffy_sleeves, thighs, sleeves_past_fingers, choker, off_shoulder, collarbone, white_background, closed_mouth, blush, simple_background, covered_navel, smile, frilled_one-piece_swimsuit, off-shoulder_one-piece_swimsuit |
| 7 | 7 |  |  |  |  |  | 1girl, bare_shoulders, collarbone, frilled_bikini, solo, choker, earrings, long_sleeves, looking_at_viewer, navel, off_shoulder, puffy_sleeves, sleeves_past_fingers, closed_mouth, smile, white_bikini, side_ponytail, water |
| 8 | 7 |  |  |  |  |  | 1girl, black_pants, long_sleeves, looking_at_viewer, solo, blue_jacket, blush, midriff, navel, open_jacket, sleeves_past_fingers, cropped_jacket, necklace, collarbone, smile, thighs |
| 9 | 5 |  |  |  |  |  | 1girl, closed_mouth, looking_at_viewer, short_shorts, solo, belt, black_footwear, black_shorts, blush, earrings, long_sleeves, midriff, navel, open_jacket, thighs, bare_shoulders, black_shirt, character_name, crop_top, full_body, off_shoulder, outdoors, sky, sleeveless_shirt, sleeves_past_fingers, white_jacket, absurdly_long_hair, alternate_costume, black_jacket, black_thighhighs, building, choker, collarbone, hooded_jacket, stomach, thigh_boots, white_shirt, zipper_pull_tab |
| 10 | 12 |  |  |  |  |  | 1girl, black_skirt, looking_at_viewer, solo, long_sleeves, blush, pleated_skirt, sitting, prosthetic_leg, sailor_collar, serafuku, simple_background, white_background, white_shirt, neckerchief, thighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | long_sleeves | looking_at_viewer | solo | smile | upper_body | white_background | closed_mouth | simple_background | sleeves_past_fingers | sleeves_past_wrists | armored_boots | navel | ass | purple_eyes | crotch_plate | prosthetic_leg | thighs | blush | juliet_sleeves | spikes | open_mouth | revealing_clothes | :d | thighhighs | black_jacket | blue_bow | hood_up | penguin_hood | black_bikini | choker | collarbone | bare_shoulders | hair_between_eyes | :q | licking_lips | blue_one-piece_swimsuit | frills | highleg_swimsuit | puffy_sleeves | off_shoulder | covered_navel | frilled_one-piece_swimsuit | off-shoulder_one-piece_swimsuit | frilled_bikini | earrings | white_bikini | side_ponytail | water | black_pants | blue_jacket | midriff | open_jacket | cropped_jacket | necklace | short_shorts | belt | black_footwear | black_shorts | black_shirt | character_name | crop_top | full_body | outdoors | sky | sleeveless_shirt | white_jacket | absurdly_long_hair | alternate_costume | black_thighhighs | building | hooded_jacket | stomach | thigh_boots | white_shirt | zipper_pull_tab | black_skirt | pleated_skirt | sitting | sailor_collar | serafuku | neckerchief |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:---------------|:--------------------|:-------|:--------|:-------------|:-------------------|:---------------|:--------------------|:-----------------------|:----------------------|:----------------|:--------|:------|:--------------|:---------------|:-----------------|:---------|:--------|:-----------------|:---------|:-------------|:--------------------|:-----|:-------------|:---------------|:-----------|:----------|:---------------|:---------------|:---------|:-------------|:-----------------|:--------------------|:-----|:---------------|:--------------------------|:---------|:-------------------|:----------------|:---------------|:----------------|:-----------------------------|:----------------------------------|:-----------------|:-----------|:---------------|:----------------|:--------|:--------------|:--------------|:----------|:--------------|:-----------------|:-----------|:---------------|:-------|:-----------------|:---------------|:--------------|:-----------------|:-----------|:------------|:-----------|:------|:-------------------|:---------------|:---------------------|:--------------------|:-------------------|:-----------|:----------------|:----------|:--------------|:--------------|:------------------|:--------------|:----------------|:----------|:----------------|:-----------|:--------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | | X | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 15 |  |  |  |  |  | X | X | X | X | | | | X | | X | | X | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | X | X | | | | | | X | | X | X | X | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | X | X | X | X | | | X | | X | | | | | | | | X | X | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 38 |  |  |  |  |  | X | X | X | X | X | | X | X | X | X | | | | | | | | X | X | | | | | | | | | | | | X | X | X | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 7 |  |  |  |  |  | X | X | X | X | X | | | X | | X | | | X | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | X | X | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 7 |  |  |  |  |  | X | X | X | X | X | | | | | X | | | X | | | | | X | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | X | X | X | | | | X | | X | | | X | | | | | X | X | | | | | | | X | | | | | X | X | X | | | | | | | | X | | | | | X | | | | | | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 10 | 12 |  |  |  |  |  | X | X | X | X | | | X | | X | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | X | X | X | X | X |
|
ducha07/audio_HTV_thoisu | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: start_time
dtype: float64
- name: end_time
dtype: float64
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 158305000.91
num_examples: 4322
download_size: 161213186
dataset_size: 158305000.91
---
# Dataset Card for "audio_HTV_thoisu"
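Per the YAML metadata above, each record pairs an audio clip with `start_time`/`end_time` offsets (in seconds) and a `transcription`. A minimal sketch of working with that schema; the record below is a stand-in with invented values, not real data from this set:

```python
def segment_duration(example: dict) -> float:
    """Length in seconds of one transcribed segment."""
    return example["end_time"] - example["start_time"]

# Stand-in record mirroring the card's fields (values invented for illustration)
example = {"start_time": 12.5, "end_time": 15.0, "transcription": "ví dụ"}
print(f"{segment_duration(example):.2f}s")  # → 2.50s
```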
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/ohara_mari_lovelivesunshine | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ohara_mari/小原鞠莉/오하라마리 (Love Live! Sunshine!!)
This is the dataset of ohara_mari/小原鞠莉/오하라마리 (Love Live! Sunshine!!), containing 500 images and their tags.
The core tags of this character are `blonde_hair, yellow_eyes, braid, hair_rings, crown_braid, medium_hair, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 708.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ohara_mari_lovelivesunshine/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 370.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ohara_mari_lovelivesunshine/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1213 | 832.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ohara_mari_lovelivesunshine/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 607.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ohara_mari_lovelivesunshine/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1213 | 1.20 GiB | [Download](https://huggingface.co/datasets/CyberHarem/ohara_mari_lovelivesunshine/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ohara_mari_lovelivesunshine',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, looking_at_viewer, serafuku, solo, uranohoshi_school_uniform, simple_background, smile, white_background, short_sleeves, green_eyes, upper_body, blush, aqua_neckerchief, skirt |
| 1 | 7 |  |  |  |  |  | 1girl, looking_at_viewer, smile, solo, upper_body, blush, dated, hair_ornament, happy_birthday, open_mouth, bangs, english_text, long_hair, short_sleeves, green_eyes, star_(symbol) |
| 2 | 5 |  |  |  |  |  | 1girl, cloud, looking_at_viewer, sky, solo, blush, long_hair, sunset, mini_hat, open_mouth, outdoors, upper_body, :d, bow, long_sleeves, top_hat |
| 3 | 6 |  |  |  |  |  | 1girl, solo, blush, earrings, looking_at_viewer, smile, green_eyes, horns, long_hair, blue_dress, cape, purple_gloves, simple_background, thighhighs, white_background |
| 4 | 6 |  |  |  |  |  | 1girl, hair_flower, looking_at_viewer, solo, blue_skirt, long_sleeves, blush, boots, green_eyes, grin, neckerchief, one_eye_closed |
| 5 | 6 |  |  |  |  |  | 1girl, looking_at_viewer, solo, obi, smile, yukata, blush, floral_print, food, hair_flower, tongue_out |
| 6 | 9 |  |  |  |  |  | 1girl, looking_at_viewer, smile, solo, sun_hat, white_dress, long_hair, white_headwear, blush, hat_flower, outdoors, hand_on_headwear, necklace, short_sleeves, blue_sky, cloud, day, sleeveless, standing, sundress |
| 7 | 6 |  |  |  |  |  | 1girl, collarbone, looking_at_viewer, off-shoulder_shirt, solo, belt, blush, cloud, day, outdoors, short_sleeves, :d, open_mouth, water, :3, blue_sky, bracelet, pink_shirt, shorts |
| 8 | 6 |  |  |  |  |  | 1girl, hair_ornament, pleated_skirt, school_uniform, smile, solo, striped_bowtie, white_shirt, blush, looking_at_viewer, open_mouth, plaid_skirt, collared_shirt, heart, miniskirt, one_eye_closed, cardigan_around_waist, dress_shirt, holding, leaning_forward, outdoors |
| 9 | 7 |  |  |  |  |  | 1girl, looking_at_viewer, midriff, solo, navel, smile, blush, crop_top, hairclip, one_eye_closed, paint_splatter, pants, bangs, choker, cleavage, fishnet_top, green_eyes, long_hair, long_sleeves, low_twintails, medium_breasts, sidelocks, tattoo, collarbone, heart, open_mouth, purple_jacket |
| 10 | 5 |  |  |  |  |  | 1girl, blush, cat_ears, cat_tail, detached_sleeves, frills, looking_at_viewer, neck_bell, paw_gloves, solo, thighhighs, :3, bangs, bowtie, garter_straps, hair_bow, hairclip, halloween, jack-o'-lantern, layered_skirt, long_sleeves, pink_bow, pumpkin, smile, cat_paws, heart, large_breasts, medium_breasts, on_back, open_mouth, paw_pose, pillow, streaked_hair, striped, tongue_out, twintails, white_sleeves |
| 11 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, smile, solo, white_gloves, blush, elbow_gloves, holding, sparkle, strapless_dress, tiara, wedding_dress, white_dress, bouquet, christmas, fur_trim, hair_flower, hat, necklace, petals, pink_background, purple_rose, skirt_hold |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | serafuku | solo | uranohoshi_school_uniform | simple_background | smile | white_background | short_sleeves | green_eyes | upper_body | blush | aqua_neckerchief | skirt | dated | hair_ornament | happy_birthday | open_mouth | bangs | english_text | long_hair | star_(symbol) | cloud | sky | sunset | mini_hat | outdoors | :d | bow | long_sleeves | top_hat | earrings | horns | blue_dress | cape | purple_gloves | thighhighs | hair_flower | blue_skirt | boots | grin | neckerchief | one_eye_closed | obi | yukata | floral_print | food | tongue_out | sun_hat | white_dress | white_headwear | hat_flower | hand_on_headwear | necklace | blue_sky | day | sleeveless | standing | sundress | collarbone | off-shoulder_shirt | belt | water | :3 | bracelet | pink_shirt | shorts | pleated_skirt | school_uniform | striped_bowtie | white_shirt | plaid_skirt | collared_shirt | heart | miniskirt | cardigan_around_waist | dress_shirt | holding | leaning_forward | midriff | navel | crop_top | hairclip | paint_splatter | pants | choker | cleavage | fishnet_top | low_twintails | medium_breasts | sidelocks | tattoo | purple_jacket | cat_ears | cat_tail | detached_sleeves | frills | neck_bell | paw_gloves | bowtie | garter_straps | hair_bow | halloween | jack-o'-lantern | layered_skirt | pink_bow | pumpkin | cat_paws | large_breasts | on_back | paw_pose | pillow | streaked_hair | striped | twintails | white_sleeves | white_gloves | elbow_gloves | sparkle | strapless_dress | tiara | wedding_dress | bouquet | christmas | fur_trim | hat | petals | pink_background | purple_rose | skirt_hold |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------------------|:-----------|:-------|:----------------------------|:--------------------|:--------|:-------------------|:----------------|:-------------|:-------------|:--------|:-------------------|:--------|:--------|:----------------|:-----------------|:-------------|:--------|:---------------|:------------|:----------------|:--------|:------|:---------|:-----------|:-----------|:-----|:------|:---------------|:----------|:-----------|:--------|:-------------|:-------|:----------------|:-------------|:--------------|:-------------|:--------|:-------|:--------------|:-----------------|:------|:---------|:---------------|:-------|:-------------|:----------|:--------------|:-----------------|:-------------|:-------------------|:-----------|:-----------|:------|:-------------|:-----------|:-----------|:-------------|:---------------------|:-------|:--------|:-----|:-----------|:-------------|:---------|:----------------|:-----------------|:-----------------|:--------------|:--------------|:-----------------|:--------|:------------|:------------------------|:--------------|:----------|:------------------|:----------|:--------|:-----------|:-----------|:-----------------|:--------|:---------|:-----------|:--------------|:----------------|:-----------------|:------------|:---------|:----------------|:-----------|:-----------|:-------------------|:---------|:------------|:-------------|:---------|:----------------|:-----------|:------------|:------------------|:----------------|:-----------|:----------|:-----------|:----------------|:----------|:-----------|:---------|:----------------|:----------|:------------|:----------------|:---------------|:---------------|:----------|:------------------|:--------|:----------------|:----------|:------------|:-----------|:------|:---------|:------------------|:--------------|:-------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | | X | | | X | | X | X | X | X | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | | X | | | | | | | X | X | | | | | | X | | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | | X | | X | X | X | | X | | X | | | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | X | | X | | | | | | X | | X | | | | | | | | | | | | | | | | | | X | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | X | | X | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 9 |  |  |  |  |  | X | X | | X | | | X | | X | | | X | | | | | | | | | X | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | X | | X | | | | | X | | | X | | | | | | X | | | | | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | X | X | | X | | | X | | | | | X | | | | X | | X | | | | | | | | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 7 |  |  |  |  |  | X | X | | X | | | X | | | X | | X | | | | | | X | X | | X | | | | | | | | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 5 |  |  |  |  |  | X | X | | X | | | X | | | | | X | | | | | | X | X | | | | | | | | | | | X | | | | | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | X | | | | | | | | | X | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 11 | 5 |  |  |  |  |  | X | X | | X | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
jeggers/ifqa | ---
language:
- en
license: odc-by
---
IfQA Counterfactual Reasoning Benchmark by [Allen Institute for AI](https://allenai.org/data/ifqa)
Aristo • 2023
Counterfactual reasoning benchmark introduced in the EMNLP-2023 paper titled "IfQA: A Dataset for Open-domain Question Answering under Counterfactual Presuppositions".
License: ODC-BY
IfQA is an open-domain question answering dataset where each of the 3,800 questions, annotated by crowdworkers, is based on a counterfactual presupposition via an "if" clause. For example: if Los Angeles was on the east coast of the U.S., what would be the time difference between Los Angeles and Paris?
Such questions require models to go beyond retrieving direct factual knowledge from the Web: they must identify the right information to retrieve and reason about an imagined situation that may even go against the facts built into their parameters. |
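Each question therefore carries an explicit counterfactual premise in its leading "if" clause. A naive, purely illustrative sketch (not the dataset's own tooling) of separating that clause from the question proper, using the card's own example:

```python
def split_presupposition(question: str):
    """Split an IfQA-style question into its counterfactual 'if' clause
    and the question proper. Naive heuristic: cut at the first ', '."""
    head, _, rest = question.partition(", ")
    if head.lower().startswith("if "):
        return head, rest
    return None, question

q = ("if Los Angeles was on the east coast of the U.S., "
     "what would be the time difference between Los Angeles and Paris?")
presupposition, body = split_presupposition(q)
print(presupposition)  # the counterfactual premise
print(body)            # the question under that premise
```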
open-llm-leaderboard/details_Kquant03__Azathoth-16x7B-bf16 | ---
pretty_name: Evaluation run of Kquant03/Azathoth-16x7B-bf16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Kquant03/Azathoth-16x7B-bf16](https://huggingface.co/Kquant03/Azathoth-16x7B-bf16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kquant03__Azathoth-16x7B-bf16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-05T13:02:29.525872](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Azathoth-16x7B-bf16/blob/main/results_2024-02-05T13-02-29.525872.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6527351745375014,\n\
\ \"acc_stderr\": 0.03209438096547266,\n \"acc_norm\": 0.6516832663399301,\n\
\ \"acc_norm_stderr\": 0.03277230372243864,\n \"mc1\": 0.576499388004896,\n\
\ \"mc1_stderr\": 0.01729742144853475,\n \"mc2\": 0.6960737340136303,\n\
\ \"mc2_stderr\": 0.01510691757213546\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7158703071672355,\n \"acc_stderr\": 0.013179442447653886,\n\
\ \"acc_norm\": 0.7380546075085325,\n \"acc_norm_stderr\": 0.012849054826858107\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7253535152360088,\n\
\ \"acc_stderr\": 0.004454237797448359,\n \"acc_norm\": 0.8886675960963951,\n\
\ \"acc_norm_stderr\": 0.003139004815925866\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n\
\ \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \
\ \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.032400380867927465,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.032400380867927465\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n\
\ \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n\
\ \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03038835355188679,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03038835355188679\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092434,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092434\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250447,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250447\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676177,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676177\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137296,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137296\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368982,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368982\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4435754189944134,\n\
\ \"acc_stderr\": 0.016615680401003724,\n \"acc_norm\": 0.4435754189944134,\n\
\ \"acc_norm_stderr\": 0.016615680401003724\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"\
acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n\
\ \"acc_stderr\": 0.012743072942653345,\n \"acc_norm\": 0.46740547588005216,\n\
\ \"acc_norm_stderr\": 0.012743072942653345\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406762,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406762\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000328,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000328\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.576499388004896,\n\
\ \"mc1_stderr\": 0.01729742144853475,\n \"mc2\": 0.6960737340136303,\n\
\ \"mc2_stderr\": 0.01510691757213546\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8547750591949487,\n \"acc_stderr\": 0.009902153904760817\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6982562547384382,\n \
\ \"acc_stderr\": 0.012643544762873358\n }\n}\n```"
repo_url: https://huggingface.co/Kquant03/Azathoth-16x7B-bf16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|arc:challenge|25_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|gsm8k|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hellaswag|10_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-05T13-02-29.525872.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-05T13-02-29.525872.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- '**/details_harness|winogrande|5_2024-02-05T13-02-29.525872.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-05T13-02-29.525872.parquet'
- config_name: results
data_files:
- split: 2024_02_05T13_02_29.525872
path:
- results_2024-02-05T13-02-29.525872.parquet
- split: latest
path:
- results_2024-02-05T13-02-29.525872.parquet
---
# Dataset Card for Evaluation run of Kquant03/Azathoth-16x7B-bf16
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kquant03/Azathoth-16x7B-bf16](https://huggingface.co/Kquant03/Azathoth-16x7B-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kquant03__Azathoth-16x7B-bf16",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-05T13:02:29.525872](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Azathoth-16x7B-bf16/blob/main/results_2024-02-05T13-02-29.525872.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6527351745375014,
"acc_stderr": 0.03209438096547266,
"acc_norm": 0.6516832663399301,
"acc_norm_stderr": 0.03277230372243864,
"mc1": 0.576499388004896,
"mc1_stderr": 0.01729742144853475,
"mc2": 0.6960737340136303,
"mc2_stderr": 0.01510691757213546
},
"harness|arc:challenge|25": {
"acc": 0.7158703071672355,
"acc_stderr": 0.013179442447653886,
"acc_norm": 0.7380546075085325,
"acc_norm_stderr": 0.012849054826858107
},
"harness|hellaswag|10": {
"acc": 0.7253535152360088,
"acc_stderr": 0.004454237797448359,
"acc_norm": 0.8886675960963951,
"acc_norm_stderr": 0.003139004815925866
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.03038835355188679,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.03038835355188679
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092434,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092434
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250447,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250447
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676177,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676177
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.036412970813137296,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.036412970813137296
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368982,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368982
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4435754189944134,
"acc_stderr": 0.016615680401003724,
"acc_norm": 0.4435754189944134,
"acc_norm_stderr": 0.016615680401003724
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.012743072942653345,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.012743072942653345
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406762,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406762
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000328,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000328
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.576499388004896,
"mc1_stderr": 0.01729742144853475,
"mc2": 0.6960737340136303,
"mc2_stderr": 0.01510691757213546
},
"harness|winogrande|5": {
"acc": 0.8547750591949487,
"acc_stderr": 0.009902153904760817
},
"harness|gsm8k|5": {
"acc": 0.6982562547384382,
"acc_stderr": 0.012643544762873358
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ai4bharat/indic-instruct-data-v0.1 | ---
annotations_creators:
- found
language_creators:
- found
language:
- en
- hi
multilinguality:
- multilingual
size_categories:
- 5K<n<400K
language_bcp47:
- en-US
- hi-IN
dataset_info:
- config_name: dolly
features:
- name: id
dtype: string
- name: category
dtype: string
- name: instruction
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: backtranslated_instruction
dtype: string
- name: backtranslated_context
dtype: string
- name: backtranslated_response
dtype: string
- name: quality_metrics
struct:
- name: chrF
dtype: double
- name: chrF++
dtype: double
- name: sacreBLEU
dtype: double
splits:
- name: en
num_bytes: 12955675
num_examples: 15011
- name: hi
num_bytes: 43020098
num_examples: 15011
- config_name: flan_v2
features:
- name: id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
- name: backtranslated_inputs
dtype: string
- name: backtranslated_targets
dtype: string
- name: quality_metrics
struct:
- name: chrF
dtype: double
- name: chrF++
dtype: double
- name: sacreBLEU
dtype: double
- name: metadata
struct:
- name: _task_name
dtype: string
- name: _task_source
dtype: string
- name: _template_idx
dtype: int64
- name: _template_type
dtype: string
splits:
- name: en
num_bytes: 139835406
num_examples: 67463
- name: hi
num_bytes: 692609723
num_examples: 67463
- config_name: anudesh
features:
- name: id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: num_turns
dtype: int64
- name: model
dtype: string
splits:
- name: en
num_bytes: 16957645
num_examples: 5234
- name: hi
num_bytes: 37608562
num_examples: 7577
- config_name: oasst1
features:
- name: id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: backtranslated_content
dtype: string
- name: created_date
dtype: string
- name: deleted
dtype: bool
- name: detoxify
struct:
- name: identity_attack
dtype: float64
- name: insult
dtype: float64
- name: obscene
dtype: float64
- name: severe_toxicity
dtype: float64
- name: sexual_explicit
dtype: float64
- name: threat
dtype: float64
- name: toxicity
dtype: float64
- name: emojis
struct:
- name: '+1'
dtype: float64
- name: '-1'
dtype: float64
- name: _skip_labeling
dtype: float64
- name: _skip_ranking
dtype: float64
- name: _skip_reply
dtype: float64
- name: red_flag
dtype: float64
- name: labels
struct:
- name: creativity
struct:
- name: count
dtype: int64
- name: value
dtype: float64
- name: fails_task
struct:
- name: count
dtype: int64
- name: value
dtype: float64
- name: hate_speech
struct:
- name: count
dtype: int64
- name: value
dtype: float64
- name: helpfulness
struct:
- name: count
dtype: int64
- name: value
dtype: float64
- name: humor
struct:
- name: count
dtype: int64
- name: value
dtype: float64
- name: lang_mismatch
struct:
- name: count
dtype: int64
- name: value
dtype: float64
- name: moral_judgement
struct:
- name: count
dtype: int64
- name: value
dtype: float64
- name: not_appropriate
struct:
- name: count
dtype: int64
- name: value
dtype: float64
- name: pii
struct:
- name: count
dtype: int64
- name: value
dtype: float64
- name: political_content
struct:
- name: count
dtype: int64
- name: value
dtype: float64
- name: quality
struct:
- name: count
dtype: int64
- name: value
dtype: float64
- name: sexual_content
struct:
- name: count
dtype: int64
- name: value
dtype: float64
- name: spam
struct:
- name: count
dtype: int64
- name: value
dtype: float64
- name: toxicity
struct:
- name: count
dtype: int64
- name: value
dtype: float64
- name: violence
struct:
- name: count
dtype: int64
- name: value
dtype: float64
- name: message_id
dtype: string
- name: parent_id
dtype: string
- name: rank
dtype: float64
- name: review_count
dtype: int64
- name: review_result
dtype: bool
- name: role
dtype: string
- name: synthetic
dtype: bool
- name: text
dtype: string
- name: user_id
dtype: string
- name: quality_metrics
struct:
- name: chrF
dtype: double
- name: chrF++
dtype: double
- name: sacreBLEU
dtype: double
splits:
- name: en
num_bytes: 102808820
num_examples: 19945
- name: hi
num_bytes: 234040644
num_examples: 20128
- config_name: hh-rlhf
features:
- name: id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: num_turns
dtype: int64
- name: quality_metrics
struct:
- name: chrF
dtype: double
- name: chrF++
dtype: double
- name: sacreBLEU
dtype: double
splits:
- name: en
num_bytes: 5196642
num_examples: 5000
- name: hi
num_bytes: 12725636
num_examples: 5000
- config_name: nmt-seed
features:
- name: id
dtype: string
- name: input_text
dtype: string
- name: output_text
dtype: string
- name: input_language
dtype: string
- name: output_language
dtype: string
- name: bucket
dtype: string
splits:
- name: hi
num_bytes: 20519477
num_examples: 50000
- config_name: wikihow
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: intro
dtype: string
- name: n_steps
dtype: int64
- name: steps
list:
- name: description
dtype: string
- name: number
dtype: int64
- name: picture
dtype: string
- name: summary
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: en
num_bytes: 262392614
num_examples: 20400
- name: hi
num_bytes: 172318437
num_examples: 6055
- config_name: lm_sys
features:
- name: id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: backtranslated_content
dtype: string
- name: role
dtype: string
- name: quality_metrics
struct:
- name: chrF++
dtype: double
splits:
- name: en
num_bytes: 113785744
num_examples: 50000
- name: hi
num_bytes: 381591698
num_examples: 50000
configs:
- config_name: dolly
data_files:
- split: en
path: dolly/en-*
- split: hi
path: dolly/hi-*
- config_name: flan_v2
data_files:
- split: en
path: flan_v2/en-*
- split: hi
path: flan_v2/hi-*
- config_name: anudesh
data_files:
- split: en
path: anudesh/en-*
- split: hi
path: anudesh/hi-*
- config_name: oasst1
data_files:
- split: en
path: oasst1/en-*
- split: hi
path: oasst1/hi-*
- config_name: hh-rlhf
data_files:
- split: en
path: hh-rlhf/en-*
- split: hi
path: hh-rlhf/hi-*
- config_name: nmt-seed
data_files:
- split: hi
path: nmt/en-hi-*
- config_name: wikihow
data_files:
- split: en
path: wikihow/en-*
- split: hi
path: wikihow/hi-*
- config_name: lm_sys
data_files:
- split: en
path: lm_sys/en-*
- split: hi
path: lm_sys/hi-*
---
# Indic Instruct Data v0.1
A collection of instruction datasets spanning the English and Hindi languages. The collection consists of:
- Anudesh
- [wikiHow](https://www.wikihow.com/Main-Page)
- [Flan v2](https://github.com/google-research/FLAN/blob/main/flan/v2/README.md) (67k sample subset)
- [Dolly](https://huggingface.co/datasets/databricks/databricks-dolly-15k)
- [Anthropic-HHH](https://huggingface.co/datasets/Anthropic/hh-rlhf) (5k sample subset)
- [OpenAssistant v1](https://huggingface.co/datasets/OpenAssistant/oasst1)
- [LMSYS-Chat-1M](https://huggingface.co/datasets/lmsys/lmsys-chat-1m) (50k sample subset)
We translate the English subset of specific datasets using IndicTrans2 ([Gala et al., 2023](https://openreview.net/forum?id=vfT4YuzAYA)). The chrF++ score between each back-translated example and its corresponding original is provided for quality assessment of the translated datasets.
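As a sketch of how these scores can be used, one might drop low-quality translations by thresholding the stored `quality_metrics` struct (the plain-dict rows and the chrF++ cutoff of 50.0 here are illustrative assumptions, not values prescribed by the dataset authors):

```python
# Illustrative quality filter over translated examples.
# Each row mirrors the `quality_metrics` struct (chrF, chrF++, sacreBLEU)
# carried by the translated configs; the rows and the 50.0 threshold
# are made up for this sketch.
rows = [
    {"id": "dolly-1", "quality_metrics": {"chrF": 62.1, "chrF++": 60.4, "sacreBLEU": 35.2}},
    {"id": "dolly-2", "quality_metrics": {"chrF": 41.0, "chrF++": 39.7, "sacreBLEU": 18.9}},
]

def keep(row, min_chrfpp=50.0):
    """Keep a translated row only if its back-translation chrF++ is high enough."""
    return row["quality_metrics"]["chrF++"] >= min_chrfpp

filtered = [r for r in rows if keep(r)]
print([r["id"] for r in filtered])  # only the high-chrF++ row survives
```

The same predicate can be passed to `datasets.Dataset.filter` to subset a translated split in place.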
We create and release two native Hindi instruction datasets:
- wikiHow: wikiHow is an online wiki-style platform that serves as a valuable resource for a diverse array of how-to articles spanning numerous topics.
- Anudesh: Anudesh is a crowd-sourced collection of prompts accompanied by responses generated from the Llama 2 70B model.
We recommend that readers check out our [technical report](https://arxiv.org/abs/2401.15006) for more details on the dataset curation process and license.
## Citation
```bibtex
@article{gala2024airavata,
title = {Airavata: Introducing Hindi Instruction-tuned LLM},
author = {Jay Gala and Thanmay Jayakumar and Jaavid Aktar Husain and Aswanth Kumar M and Mohammed Safi Ur Rahman Khan and Diptesh Kanojia and Ratish Puduppully and Mitesh M. Khapra and Raj Dabre and Rudra Murthy and Anoop Kunchukuttan},
year = {2024},
journal = {arXiv preprint arXiv: 2401.15006}
}
```
|
ziozzang/deepl-trans-FR-KO | ---
task_categories:
- translation
language:
- ko
- fr
---
This dataset is a collection of Wikipedia articles with DeepL translations, aggregated automatically.
# String/Corpus pairs
From FR/French to KO/Korean.
# Quality Filtering
- Stripped all HTML tags.
- Removed references and annotation marks.
- Filtered by string length.
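A minimal sketch of that kind of cleaning pipeline (the regex patterns and the 20-character length cutoff are assumptions for illustration; the exact rules used by the author are not published):

```python
import re

def clean(text, min_len=20):
    """Strip HTML tags and bracketed reference marks, then length-filter.

    The tag/reference patterns and `min_len` cutoff are illustrative
    assumptions, not the dataset's exact filtering rules.
    """
    text = re.sub(r"<[^>]+>", "", text)  # drop HTML tags
    text = re.sub(r"\[\d+\]", "", text)  # drop reference marks like [3]
    text = text.strip()
    return text if len(text) >= min_len else None

sample = "<p>La tour Eiffel est un monument parisien.[1]</p>"
print(clean(sample))  # "La tour Eiffel est un monument parisien."
```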
---
The strings/corpus were aggregated from Wikipedia (fr) and translated using DeepL.
All data collected by Jioh L. Jung <ziozzang@gmail.com>.
license: mit
--- |
ledoc/ultimatevoice | ---
license: apache-2.0
---
|
zolak/twitter_dataset_81_1713054820 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2647909
num_examples: 6553
download_size: 1326254
dataset_size: 2647909
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
makram93/rejected_pairs_st | ---
dataset_info:
features:
- name: url
dtype: string
- name: doc_id
dtype: string
- name: original_title
sequence: string
- name: right
dtype: string
- name: left
dtype: string
splits:
- name: train
num_bytes: 88447.0623234648
num_examples: 100
download_size: 82694
dataset_size: 88447.0623234648
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "rejected_pairs_st"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-professional_psychology-neg-answer | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_answer
dtype: string
splits:
- name: test
num_bytes: 259865
num_examples: 612
download_size: 155435
dataset_size: 259865
---
# Dataset Card for "mmlu-professional_psychology-neg-answer"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Brand24/mms | ---
annotations_creators:
- mixed
language:
- ar
- bg
- bs
- cs
- de
- el
- en
- es
- fa
- fr
- he
- hi
- hr
- hu
- it
- ja
- lv
- pl
- pt
- ru
- sk
- sl
- sq
- sr
- sv
- th
- ur
- zh
license:
- other
multilinguality:
- multi-lingual
size_categories:
- 1M<n<10M
task_categories:
- text-classification
task_ids:
- sentiment-classification
pretty_name: Massive-Multilingual-Sentiment
---
# Massive Multilingual Sentiment Corpora (MMS)
## Corpora Summary
Despite impressive advancements in multilingual corpora collection and model training, developing large-scale deployments of multilingual models still presents a significant challenge. This is particularly true for language tasks that are culture-dependent. One such example is the area of multilingual sentiment analysis, where affective markers can be subtle and deeply ensconced in culture.
This work presents the most extensive open massively multilingual corpus of datasets for training sentiment models. The corpus consists of 79 datasets, manually selected from over 350 datasets reported in the scientific literature based on strict quality criteria, and covers 27 languages. Datasets can be queried using several linguistic and functional features. In addition, we present a multi-faceted sentiment classification benchmark summarizing hundreds of experiments conducted on different base models, training objectives, dataset collections, and fine-tuning strategies.
More about dataset here [https://brand24-ai.github.io/mms_benchmark](https://brand24-ai.github.io/mms_benchmark).
## General licenses information
This is a library of the open-sourced datasets that we gathered. We provide citations or links to the sources of these datasets. It is essential to mention that these datasets may have different licenses, and we encourage everybody to check the permissions of each dataset separately. This is critical because, for example, not all datasets will be available for commercial purposes. Checking the licenses ensures that proper consent and permissions are obtained for the use and curation of the data, respecting the rights and privacy of the individuals whose data is included in the datasets. Please cite our library and the authors of each dataset you want to use.
## Usage
```python
import datasets
# whole dataset will be downloaded and cached
mms_dataset = datasets.load_dataset("Brand24/mms")
# filter only texts in Polish
pl = mms_dataset.filter(lambda row: row['language'] == 'pl')
```
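Once loaded, per-language label distributions like the table below can be recomputed with a single counting pass. The following is a minimal sketch on toy rows; real rows carry the `language` and `label` fields described in the Dataset Structure section, where `label` is stored as a ClassLabel integer that you may want to map back to its name first.

```python
from collections import Counter

def label_distribution(rows):
    """Count (language, label) pairs across an iterable of row dicts."""
    return Counter((row["language"], row["label"]) for row in rows)

# Toy rows standing in for mms_dataset["train"]; in the real dataset the
# label is a ClassLabel integer that can be mapped back to its name via
# the dataset's features.
toy_rows = [
    {"language": "pl", "label": "positive"},
    {"language": "pl", "label": "positive"},
    {"language": "pl", "label": "negative"},
    {"language": "en", "label": "neutral"},
]

counts = label_distribution(toy_rows)
print(counts[("pl", "positive")])  # 2
```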
## Corpora statistics
### Per language
| language | label_name | count |
|:-----------|:-------------|--------:|
| ar | negative | 138899 |
| ar | neutral | 192774 |
| ar | positive | 600402 |
| bg | negative | 13930 |
| bg | neutral | 28657 |
| bg | positive | 19563 |
| bs | negative | 11974 |
| bs | neutral | 11145 |
| bs | positive | 13064 |
| cs | negative | 39674 |
| cs | neutral | 59200 |
| cs | positive | 97413 |
| de | negative | 104667 |
| de | neutral | 100071 |
| de | positive | 111149 |
| el | negative | 230 |
| el | neutral | 38 |
| el | positive | 232 |
| en | negative | 304939 |
| en | neutral | 290823 |
| en | positive | 1734724 |
| es | negative | 108733 |
| es | neutral | 122493 |
| es | positive | 187486 |
| fa | negative | 1602 |
| fa | neutral | 5091 |
| fa | positive | 6832 |
| fr | negative | 84187 |
| fr | neutral | 43245 |
| fr | positive | 83199 |
| he | negative | 2279 |
| he | neutral | 243 |
| he | positive | 6097 |
| hi | negative | 4992 |
| hi | neutral | 6392 |
| hi | positive | 5615 |
| hr | negative | 19757 |
| hr | neutral | 19470 |
| hr | positive | 38367 |
| hu | negative | 8974 |
| hu | neutral | 17621 |
| hu | positive | 30087 |
| it | negative | 4043 |
| it | neutral | 4193 |
| it | positive | 3829 |
| ja | negative | 83982 |
| ja | neutral | 41979 |
| ja | positive | 83819 |
| lv | negative | 1378 |
| lv | neutral | 2618 |
| lv | positive | 1794 |
| pl | negative | 77422 |
| pl | neutral | 62074 |
| pl | positive | 97192 |
| pt | negative | 56827 |
| pt | neutral | 55165 |
| pt | positive | 45842 |
| ru | negative | 31770 |
| ru | neutral | 48106 |
| ru | positive | 31054 |
| sk | negative | 14431 |
| sk | neutral | 12842 |
| sk | positive | 29350 |
| sl | negative | 33694 |
| sl | neutral | 50553 |
| sl | positive | 29296 |
| sq | negative | 6889 |
| sq | neutral | 14757 |
| sq | positive | 22638 |
| sr | negative | 25089 |
| sr | neutral | 32283 |
| sr | positive | 18996 |
| sv | negative | 16266 |
| sv | neutral | 13342 |
| sv | positive | 11738 |
| th | negative | 9326 |
| th | neutral | 28616 |
| th | positive | 34377 |
| ur | negative | 5239 |
| ur | neutral | 8585 |
| ur | positive | 5836 |
| zh | negative | 117967 |
| zh | neutral | 69016 |
| zh | positive | 144719 |
## Dataset Structure
### Linguistic Typology
The field of language typology focuses on studying the similarities and differences among languages. These differences can be categorized into phonological (sounds), syntactic (structures), lexical (vocabulary), and theoretical aspects. Linguistic typology analyzes the current state of languages, contrasting with genealogical linguistics, which examines historical relationships between languages.
Genealogical linguistics studies language families and genera. A language family consists of languages that share a common ancestral language, while genera are branches within a language family. The Indo-European family, for example, includes genera such as Slavic, Romance, Germanic, and Indic. Over 7000 languages are categorized into approximately 150 language families, with Indo-European, Sino-Tibetan, Turkic, Afro-Asiatic, Nilo-Saharan, Niger-Congo, and Eskimo-Aleut being some of the largest families.
Within linguistic typology, languages are described using various linguistic features. Our work focuses on sentiment classification; each record carries the dataset fields below together with ten relevant typological features:
- `text`: The feature text represents the actual text of the sentiment dataset. It is of type string and contains the text samples or sentences for sentiment analysis.
- `label`: The feature label corresponds to the sentiment labels of the text samples. It is of type ClassLabel and has three possible values: negative, neutral, and positive. These labels indicate the sentiment or emotional polarity associated with the text.
- `original_dataset`: The feature original_dataset refers to the name or identifier of the original dataset from which the text samples were extracted. It is of type string and provides information about the source dataset.
- `domain`: The feature domain represents the domain or topic of the sentiment dataset. It is of type string and provides context regarding the subject matter of the text samples.
- `language`: The feature language indicates the language of the text samples in the sentiment dataset. It is of type string and specifies the language in which the text is written.
- `Family`: The feature Family represents the language family to which a specific language belongs. It is of type string and provides information about the broader categorization of languages into language families.
- `Genus`: The feature Genus corresponds to the genus or branch within a language family. It is of type string and indicates the specific subgrouping of languages within a language family.
- `Definite article`: Half of the languages do not use the definite article, which signals uniqueness or definiteness of a concept.
- `Indefinite article`: Half of the languages do not use the indefinite article, with some languages using a separate article or the numeral "one."
- `Number of cases`: Languages vary greatly in the number of morphological cases used.
- `Order of subject, verb, and object`: Different languages have different word orderings, with variations like SOV, SVO, VSO, VOS, OVS, and OSV.
- `Negative morphemes`: Negative morphemes indicate clausal negation in declarative sentences.
- `Polar questions`: Questions with yes/no answers, which can be formed using question particles, interrogative morphology, or intonation.
- `Position of the negative morpheme`: The position of the negative morpheme can vary in relation to subjects and objects.
- `Prefixing vs. suffixing`: Languages differ in their use of prefixes and suffixes in inflectional morphology.
- `Coding of nominal plurals`: Plurals can be expressed through morphological changes or the use of plurality indicator morphemes.
- `Grammatical genders`: Languages vary in the number of grammatical genders used, or may not use the concept at all.
These language features are available as filtering options in our library. Users can download specific facets of the collection, such as datasets in Slavic languages with interrogative word order for polar questions or datasets from the Afro-Asiatic language family without morphological case-making.
### Usage
Code example for loading and filtering Slavic language in which polar questions are formed using the interrogative word order
```python
import datasets
mms_dataset = datasets.load_dataset("Brand24/mms")
slavic = mms_dataset.filter(lambda row: row["Genus"] == "Slavic" and row["Polar questions"] == "interrogative word order")
```
Filtering sentiment datasets from the Afro-Asiatic language family without morphological case-making
```python
afro_asiatic = mms_dataset.filter(lambda row: row["Family"] == "Afro-Asiatic" and row["Number of cases"] == "no morphological case-making")
```
## Dataset Creation
### Who are the source language producers?
The data comes from multiple papers and covers a large variety of languages. For the specific dataset information, please check out the companion paper.
### Annotations
As with the data producers, you should check the papers that propose the specific datasets you are interested in.
#### Annotation process
We describe the annotation process for the internally created datasets included in this corpus.
## Considerations for Using the Data
### Social Impact and Limitations
The corpus is intended to bring more sentiment-annotated data to a wide variety of languages. Its aim is to make large amounts of data available to lower-resource languages in order to facilitate the training of state-of-the-art ML models for sentiment analysis.
## Additional Information
### Dataset Curators
The corpus was put together by
- [@laugustyniak](https://www.linkedin.com/in/lukaszaugustyniak/)
- [@swozniak](https://www.linkedin.com/in/wscode/)
- [@mgruza](https://www.linkedin.com/in/marcin-gruza-276b2512b/)
- [@pgramacki](https://www.linkedin.com/in/piotrgramacki/)
- [@krajda](https://www.linkedin.com/in/krzysztof-rajda/)
- [@mmorzy](https://www.linkedin.com/in/mikolajmorzy/)
- [@tkajdanowicz](https://www.linkedin.com/in/kajdanowicz/)
### Licensing Information
These data are released under this licensing scheme.
We do not own any text from which these data and datasets have been extracted.
We license the actual packaging of these data under the Attribution-NonCommercial 4.0 International (CC BY-NC 4.0) https://creativecommons.org/licenses/by-nc/4.0/
This work is published from Poland.
Should you consider that our data contains material that is owned by you and should, therefore not be reproduced here, please:
* Clearly identify yourself with detailed contact data such as an address, telephone number, or email address at which you can be contacted.
* Clearly identify the copyrighted work claimed to be infringed.
* Clearly identify the material claimed to be infringing and the information reasonably sufficient to allow us to locate the material.
We will comply with legitimate requests by removing the affected sources from the next release of the corpus.
### Citation Information
### The main corpus citation
```bibtex
@misc{augustyniak2023massively,
title={Massively Multilingual Corpus of Sentiment Datasets and Multi-faceted Sentiment Classification Benchmark},
author={Łukasz Augustyniak and Szymon Woźniak and Marcin Gruza and Piotr Gramacki and Krzysztof Rajda and Mikołaj Morzy and Tomasz Kajdanowicz},
year={2023},
eprint={2306.07902},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
### All datasets in corpus
[https://brand24-ai.github.io/mms_benchmark/citations.html](https://brand24-ai.github.io/mms_benchmark/citations.html)
## Acknowledgements
- BRAND24 - https://brand24.com
- CLARIN-PL-Biz - https://clarin.biz
|
autoevaluate/autoeval-staging-eval-project-emotion-04ae905d-13795904 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- emotion
eval_info:
task: multi_class_classification
model: tbasic5/distilbert-base-uncased-finetuned-emotion
metrics: []
dataset_name: emotion
dataset_config: default
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: tbasic5/distilbert-base-uncased-finetuned-emotion
* Dataset: emotion
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
SEACrowd/indocollex | ---
tags:
- morphological-inflection
language:
- ind
---
# indocollex
IndoCollex: A Testbed for Morphological Transformation of Indonesian Colloquial Words
## Dataset Usage
Run `pip install nusacrowd` before loading the dataset through HuggingFace's `load_dataset`.
## Citation
```
@inproceedings{wibowo-etal-2021-indocollex,
title = "{I}ndo{C}ollex: A Testbed for Morphological Transformation of {I}ndonesian Word Colloquialism",
    author = {Wibowo, Haryo Akbarianto and Nityasya, Made Nindyatama and Aky{\"u}rek, Afra Feyza and Fitriany, Suci and Aji, Alham Fikri and Prasojo, Radityo Eko and Wijaya, Derry Tanti},
booktitle = "Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021",
month = aug,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.findings-acl.280",
doi = "10.18653/v1/2021.findings-acl.280",
pages = "3170--3183",
}
```
## License
CC BY-SA 4.0
## Homepage
[https://github.com/haryoa/indo-collex](https://github.com/haryoa/indo-collex)
### NusaCatalogue
For easy indexing and metadata: [https://indonlp.github.io/nusa-catalogue](https://indonlp.github.io/nusa-catalogue) |
Ericu950/ParaLoebSent | ---
task_categories:
- translation
language:
- la
- el
- en
size_categories:
- 10K<n<100K
---
# Dataset Card for Dataset Name
Parallel sentences in Latin/Greek and English
## Dataset Details
### Dataset Description
This is a very preliminary dataset of parallel sentences, used for educational purposes to finetune PhilTa and create the rudimentary Ericu950/lagrc-enTransPrel.
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
anukaver/EstQA | ---
language: et
---
# Estonian Question Answering dataset
* Dataset for extractive question answering in Estonian. It is based on Wikipedia articles, pre-filtered via PageRank. Annotation was done by one person.
* Train set includes 776 context-question-answer triplets. There are several possible answers per question, each in a separate triplet. The number of distinct questions is 512.
* Test set includes 603 samples. Each sample contains one or more golden answers. Altogether there are 892 golden answers.
### Change log
Test set v1.1 adds some more golden answers.
### Reference
If you use this dataset for research, please cite the following paper:
```
@mastersthesis{mastersthesis,
author = {Anu Käver},
title = {Extractive Question Answering for Estonian Language},
school = {Tallinn University of Technology (TalTech)},
year = 2021
}
``` |
noxneural/alb_wiki | ---
task_categories:
- question-answering
language:
- sq
pretty_name: Albanian Wiki
size_categories:
- 100K<n<1M
--- |
Lk123/S2S_L2L | ---
license: apache-2.0
---
|
hts98/ptf-dataset | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 19917891520
num_examples: 20736
- name: test
num_bytes: 4980426456
num_examples: 5185
download_size: 4001768209
dataset_size: 24898317976
---
# Dataset Card for "ptf-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sabilmakbar/sea_wiki | ---
annotations_creators:
- no-annotation
language_creators:
- crowdsourced
language:
- ace
- ban
- bcl
- bjn
- bug
- cbk
- ceb
- gor
- id
- ilo
- jv
- km
- lo
- mad
- min
- mnw
- ms
- my
- nia
- pag
- pam
- shn
- su
- ta
- th
- tl
- tet
- vi
- war
license:
- cc-by-sa-4.0
multilinguality:
- multilingual
source_datasets:
- Wikipedia
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
pretty_name: Wikipedia Archive for SEA Languages
tags:
- Wikipedia
- Southeast Asia (SEA)
- Dialect
- Banyumasan Dialect of Javanese (Ngapak)
- SEA-related Languages
- SEA Local Languages
dataset_info:
- config_name: seawiki_all
features:
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: ace
num_bytes: 4952102
num_examples: 13003
- name: ban
num_bytes: 18198909
num_examples: 20987
- name: bcl
num_bytes: 20258642
num_examples: 15743
- name: bjn
num_bytes: 6792259
num_examples: 10519
- name: bug
num_bytes: 3298561
num_examples: 15880
- name: cbk_zam
num_bytes: 2033238
num_examples: 3285
- name: ceb
num_bytes: 4572804910
num_examples: 6302896
- name: gor
num_bytes: 6239133
num_examples: 15359
- name: id
num_bytes: 1118834498
num_examples: 665622
- name: ilo
num_bytes: 16719139
num_examples: 15371
- name: jv
num_bytes: 72101470
num_examples: 73380
- name: km
num_bytes: 103146669
num_examples: 11994
- name: lo
num_bytes: 15240262
num_examples: 5014
- name: mad
num_bytes: 1612542
num_examples: 1192
- name: map_bms
num_bytes: 5221506
num_examples: 13580
- name: min
num_bytes: 116824020
num_examples: 227143
- name: mnw
num_bytes: 47321734
num_examples: 3296
- name: ms
num_bytes: 419662356
num_examples: 368628
- name: my
num_bytes: 313370839
num_examples: 109310
- name: nia
num_bytes: 2153274
num_examples: 1714
- name: pag
num_bytes: 1370162
num_examples: 2665
- name: pam
num_bytes: 8218370
num_examples: 9006
- name: shn
num_bytes: 33754296
num_examples: 13945
- name: su
num_bytes: 47516268
num_examples: 61555
- name: ta
num_bytes: 809156746
num_examples: 160651
- name: tet
num_bytes: 1454499
num_examples: 1468
- name: th
num_bytes: 1012930269
num_examples: 159719
- name: tl
num_bytes: 85356818
num_examples: 45341
- name: vi
num_bytes: 1603057633
num_examples: 1288680
- name: war
num_bytes: 454304567
num_examples: 1266394
download_size: 1829748651
dataset_size: 10923905691
- config_name: seawiki_dedup_all
features:
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: ace
num_bytes: 4944916
num_examples: 12979
- name: ban
num_bytes: 18025267
num_examples: 20611
- name: bcl
num_bytes: 19977232
num_examples: 14079
- name: bjn
num_bytes: 6786207
num_examples: 10503
- name: bug
num_bytes: 2182435
num_examples: 9969
- name: cbk_zam
num_bytes: 1579651
num_examples: 2242
- name: ceb
num_bytes: 4346511153
num_examples: 5815254
- name: gor
num_bytes: 6217480
num_examples: 15290
- name: id
num_bytes: 1117891512
num_examples: 662443
- name: ilo
num_bytes: 16719001
num_examples: 15369
- name: jv
num_bytes: 71997517
num_examples: 73080
- name: km
num_bytes: 102698901
num_examples: 11466
- name: lo
num_bytes: 14908444
num_examples: 4897
- name: mad
num_bytes: 1612542
num_examples: 1192
- name: map_bms
num_bytes: 5067489
num_examples: 11839
- name: min
num_bytes: 116721269
num_examples: 225972
- name: mnw
num_bytes: 47243333
num_examples: 3271
- name: ms
num_bytes: 414783365
num_examples: 348045
- name: my
num_bytes: 312990457
num_examples: 108819
- name: nia
num_bytes: 2153274
num_examples: 1714
- name: pag
num_bytes: 764869
num_examples: 1108
- name: pam
num_bytes: 8205723
num_examples: 8932
- name: shn
num_bytes: 33616591
num_examples: 13662
- name: su
num_bytes: 47512744
num_examples: 61529
- name: ta
num_bytes: 809061339
num_examples: 160580
- name: tet
num_bytes: 1452151
num_examples: 1464
- name: th
num_bytes: 1012868861
num_examples: 159666
- name: tl
num_bytes: 85286023
num_examples: 45121
- name: vi
num_bytes: 1602830022
num_examples: 1287912
- name: war
num_bytes: 454266479
num_examples: 1266204
download_size: 1811459996
dataset_size: 10686876247
- config_name: seawiki_with_countries_all
features:
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: brn_ms
num_bytes: 419662356
num_examples: 368628
- name: idn_ace
num_bytes: 4952102
num_examples: 13003
- name: idn_ban
num_bytes: 18198909
num_examples: 20987
- name: idn_bjn
num_bytes: 6792259
num_examples: 10519
- name: idn_bug
num_bytes: 3298561
num_examples: 15880
- name: idn_gor
num_bytes: 6239133
num_examples: 15359
- name: idn_id
num_bytes: 1118834498
num_examples: 665622
- name: idn_jv
num_bytes: 72101470
num_examples: 73380
- name: idn_mad
num_bytes: 1612542
num_examples: 1192
- name: idn_map_bms
num_bytes: 5221506
num_examples: 13580
- name: idn_min
num_bytes: 116824020
num_examples: 227143
- name: idn_ms
num_bytes: 419662356
num_examples: 368628
- name: idn_nia
num_bytes: 2153274
num_examples: 1714
- name: idn_su
num_bytes: 47516268
num_examples: 61555
- name: idn_tet
num_bytes: 1454499
num_examples: 1468
- name: khm_km
num_bytes: 103146669
num_examples: 11994
- name: lao_lo
num_bytes: 15240262
num_examples: 5014
- name: mmr_my
num_bytes: 313370839
num_examples: 109310
- name: mmr_shn
num_bytes: 33754296
num_examples: 13945
- name: mmr_mnw
num_bytes: 47321734
num_examples: 3296
- name: mys_ms
num_bytes: 419662356
num_examples: 368628
- name: mys_ta
num_bytes: 809156746
num_examples: 160651
- name: phl_war
num_bytes: 454304567
num_examples: 1266394
- name: phl_tl
num_bytes: 85356818
num_examples: 45341
- name: phl_ilo
num_bytes: 16719139
num_examples: 15371
- name: phl_bcl
num_bytes: 20258642
num_examples: 15743
- name: phl_pam
num_bytes: 8218370
num_examples: 9006
- name: phl_cbk_zam
num_bytes: 2033238
num_examples: 3285
- name: phl_pag
num_bytes: 1370162
num_examples: 2665
- name: phl_ceb
num_bytes: 4572804910
num_examples: 6302896
- name: sgp_ms
num_bytes: 419662356
num_examples: 368628
- name: sgp_ta
num_bytes: 809156746
num_examples: 160651
- name: tha_th
num_bytes: 1012930269
num_examples: 159719
- name: tha_mnw
num_bytes: 47321734
num_examples: 3296
- name: tha_shn
num_bytes: 33754296
num_examples: 13945
- name: tls_tet
num_bytes: 1454499
num_examples: 1468
- name: vnm_vi
num_bytes: 1603057633
num_examples: 1288680
download_size: 1829748651
dataset_size: 13074580034
- config_name: seawiki_with_countries_dedup_all
features:
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: brn_ms
num_bytes: 414783365
num_examples: 348045
- name: idn_ace
num_bytes: 4944916
num_examples: 12979
- name: idn_ban
num_bytes: 18025267
num_examples: 20611
- name: idn_bjn
num_bytes: 6786207
num_examples: 10503
- name: idn_bug
num_bytes: 2182435
num_examples: 9969
- name: idn_gor
num_bytes: 6217480
num_examples: 15290
- name: idn_id
num_bytes: 1117891512
num_examples: 662443
- name: idn_jv
num_bytes: 71997517
num_examples: 73080
- name: idn_mad
num_bytes: 1612542
num_examples: 1192
- name: idn_map_bms
num_bytes: 5067489
num_examples: 11839
- name: idn_min
num_bytes: 116721269
num_examples: 225972
- name: idn_ms
num_bytes: 414783365
num_examples: 348045
- name: idn_nia
num_bytes: 2153274
num_examples: 1714
- name: idn_su
num_bytes: 47512744
num_examples: 61529
- name: idn_tet
num_bytes: 1452151
num_examples: 1464
- name: khm_km
num_bytes: 102698901
num_examples: 11466
- name: lao_lo
num_bytes: 14908444
num_examples: 4897
- name: mmr_my
num_bytes: 312990457
num_examples: 108819
- name: mmr_shn
num_bytes: 33616591
num_examples: 13662
- name: mmr_mnw
num_bytes: 47243333
num_examples: 3271
- name: mys_ms
num_bytes: 414783365
num_examples: 348045
- name: mys_ta
num_bytes: 809061339
num_examples: 160580
- name: phl_war
num_bytes: 454266479
num_examples: 1266204
- name: phl_tl
num_bytes: 85286023
num_examples: 45121
- name: phl_ilo
num_bytes: 16719001
num_examples: 15369
- name: phl_bcl
num_bytes: 19977232
num_examples: 14079
- name: phl_pam
num_bytes: 8205723
num_examples: 8932
- name: phl_cbk_zam
num_bytes: 1579651
num_examples: 2242
- name: phl_pag
num_bytes: 764869
num_examples: 1108
- name: phl_ceb
num_bytes: 4346511153
num_examples: 5815254
- name: sgp_ms
num_bytes: 414783365
num_examples: 348045
- name: sgp_ta
num_bytes: 809061339
num_examples: 160580
- name: tha_th
num_bytes: 1012868861
num_examples: 159666
- name: tha_mnw
num_bytes: 47243333
num_examples: 3271
- name: tha_shn
num_bytes: 33616591
num_examples: 13662
- name: tls_tet
num_bytes: 1452151
num_examples: 1464
- name: vnm_vi
num_bytes: 1602830022
num_examples: 1287912
download_size: 1811459996
dataset_size: 12822599756
---
# **SEA Wikipedia Data Repository**
---
Welcome to the SEA Wikipedia Data Repository. The datasets are extracted from [Wikipedia HF](https://huggingface.co/datasets/wikipedia) and processed using the scripts available in this repository for reproducibility purposes. Since Wikipedia itself uses the [cc-by-sa 4.0](https://en.wikipedia.org/wiki/Wikipedia:Copyrights) license, we decided to follow it instead of the cc-by-sa 3.0 of Wikipedia HF, since it gives more rights to the initial author/contributor.
# Getting Started #
### To read the datasets directly ###
Use one of the following code chunks to load it from HuggingFace Hub:
You can pass the ```config name``` as the second argument, as in the following script:
```
dataset = load_dataset(
"sabilmakbar/sea_wiki",
"seawiki_dedup_all" # a config name, can be "seawiki_dedup_all" or "seawiki_with_countries_all", or "seawiki_with_countries_dedup_all" , defaults to "seawiki_dedup_all"
)
```
Or you can provide both ```lang``` and ```date_stamp``` (or just ```lang```, in which case ```date_stamp``` defaults to the newest one)
```
dataset = load_dataset(
"sabilmakbar/sea_wiki",
lang = "id", # see README for complete lang choices
date_stamp="20230901"
)
```
Or you can provide a ```country``` param in a similar fashion to the ```lang``` arg (providing both ```country``` and ```lang``` will prioritize the ```lang``` kwarg)
```
dataset = load_dataset(
    "sabilmakbar/sea_wiki",
    country = "idn", # see the splits for complete country choices
    date_stamp="20230901"
)
```
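With the ```seawiki_with_countries_*``` configs, split names encode both a country and a language code as ```{country}_{lang}``` (see the split list in the metadata above). A small helper can recover the two parts; note that the language part may itself contain an underscore (e.g. ```map_bms```, ```cbk_zam```), so we split only on the first one:

```python
def parse_split_name(split):
    """Split a '{country}_{lang}' split name into (country, lang).

    The language part may itself contain underscores (e.g. 'map_bms'),
    so only the first underscore separates country from language.
    """
    country, lang = split.split("_", 1)
    return country, lang

print(parse_split_name("idn_map_bms"))  # ('idn', 'map_bms')
print(parse_split_name("tha_th"))      # ('tha', 'th')
```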
# **FAQS**
### Which languages are available in this dataset, and from which countries?
You may check the following tables to understand the current coverage of this dataset (languages, countries, data size & volume). All tables are sorted by the leftmost column.
#### 1. Table of Countries and its Country Code
| Country Code | Country Name | Wiki Info |
| :---: | :---: | :---: |
| brn | Brunei | [Wiki Link](https://en.wikipedia.org/wiki/Brunei) |
| idn | Indonesia | [Wiki Link](https://en.wikipedia.org/wiki/Indonesia) |
| khm | Cambodia | [Wiki Link](https://en.wikipedia.org/wiki/Cambodia) |
| lao | Laos | [Wiki Link](https://en.wikipedia.org/wiki/Laos) |
| mmr | Myanmar | [Wiki Link](https://en.wikipedia.org/wiki/Myanmar) |
| mys | Malaysia | [Wiki Link](https://en.wikipedia.org/wiki/Malaysia) |
| phl | Philippines | [Wiki Link](https://en.wikipedia.org/wiki/Philippines) |
| sgp | Singapore | [Wiki Link](https://en.wikipedia.org/wiki/Singapore) |
| tha | Thailand | [Wiki Link](https://en.wikipedia.org/wiki/Thailand) |
| tls | East Timor | [Wiki Link](https://en.wikipedia.org/wiki/East_Timor) |
| vnm | Vietnam | [Wiki Link](https://en.wikipedia.org/wiki/Vietnam) |
#### 2. Table of Languages and Countries of its speakers
| ISO 639-3 Lang Code | Dataset Lang Code | Lang Name | Country Codes Spoken | Wiki Info | Total Data | Total Size (MiB rounded) |
| :---: | :---: | :---: | :---: | :--- | ---: | ---: |
| ace | ace | Acehnese | idn | [Wiki Link](https://en.wikipedia.org/wiki/Acehnese_language) | 12979 | 4.72 |
| ban | ban | Balinese | idn | [Wiki Link](https://en.wikipedia.org/wiki/Balinese_language) | 20611 | 17.19 |
| bcl | bcl | Central Bicolano | phl | [Wiki Link](https://en.wikipedia.org/wiki/Central_Bikol) | 14079 | 19.05 |
| bjn | bjn | Banjarese | idn | [Wiki Link](https://en.wikipedia.org/wiki/Banjarese_language) | 10503 | 6.47 |
| bug | bug | Buginese | idn | [Wiki Link](https://en.wikipedia.org/wiki/Buginese_language) | 9969 | 2.08 |
| bur | my | Burmese | mmr | [Wiki Link](https://en.wikipedia.org/wiki/Burmese_language) | 108819 | 298.49 |
| cbk | cbk_zam | Zamboanga Chavacano/Chavacano | phl | [Wiki Link](https://en.wikipedia.org/wiki/Chavacano) | 2242 | 1.51 |
| ceb | ceb | Cebuano | phl | [Wiki Link](https://en.wikipedia.org/wiki/Cebuano_language) | 5815254 | 4,145.16 |
| gor | gor | Gorontalo | idn | [Wiki Link](https://en.wikipedia.org/wiki/Gorontalo_language) | 15290 | 5.93 |
| ilo | ilo | Ilokano | phl | [Wiki Link](https://en.wikipedia.org/wiki/Ilocano_language) | 15369 | 15.94 |
| ind | id | Indonesian | idn | [Wiki Link](https://en.wikipedia.org/wiki/Indonesian_language) | 662443 | 1,066.10 |
| jav | jv | Javanese | idn | [Wiki Link](https://en.wikipedia.org/wiki/Javanese_language) | 73080 | 68.66 |
| khm | km | Khmer | khm | [Wiki Link](https://en.wikipedia.org/wiki/Khmer_language) | 11466 | 97.94 |
| lao | lo | Lao | lao | [Wiki Link](https://en.wikipedia.org/wiki/Lao_language) | 4897 | 14.22 |
| mad | mad | Madurese | idn | [Wiki Link](https://en.wikipedia.org/wiki/Madurese_language) | 1192 | 1.54 |
| may | ms | Malay | mys, sgp, brn, idn | [Wiki Link](https://en.wikipedia.org/wiki/Malay_language) | 348045 | 395.57 |
| min | min | Minangkabau | idn | [Wiki Link](https://en.wikipedia.org/wiki/Minangkabau_language) | 225972 | 111.31 |
| mnw | mnw | Mon | mmr | [Wiki Link](https://en.wikipedia.org/wiki/Mon_language) | 3271 | 45.05 |
| nia | nia | Nias | idn | [Wiki Link](https://en.wikipedia.org/wiki/Nias_language) | 1714 | 2.05 |
| pag | pag | Pangasinan | phl | [Wiki Link](https://en.wikipedia.org/wiki/Pangasinan_language) | 1108 | 0.73 |
| pam | pam | Kapampangan | phl | [Wiki Link](https://en.wikipedia.org/wiki/Kapampangan_language) | 8932 | 7.83 |
| shn | shn | Shan | mmr | [Wiki Link](https://en.wikipedia.org/wiki/Shan_language) | 13662 | 32.06 |
| sun | su | Sundanese | idn | [Wiki Link](https://en.wikipedia.org/wiki/Sundanese_language) | 61529 | 45.31 |
| tam | ta | Tamil | mys, sgp | [Wiki Link](https://en.wikipedia.org/wiki/Tamil_language) | 160580 | 771.58 |
| tgl | tl | Tagalog | phl | [Wiki Link](https://en.wikipedia.org/wiki/Tagalog_language) | 45121 | 81.34 |
| tha | th | Thai | tha | [Wiki Link](https://en.wikipedia.org/wiki/Thai_language) | 159666 | 965.95 |
| tet | tet | Tetum | tls, idn | [Wiki Link](https://en.wikipedia.org/wiki/Tetum_language) | 1464 | 1.38 |
| vie | vi | Vietnamese | vnm | [Wiki Link](https://en.wikipedia.org/wiki/Vietnamese_language) | 1287912 | 1,528.58 |
| war | war | Waray | phl | [Wiki Link](https://en.wikipedia.org/wiki/Waray_language) | 1266204 | 433.22 |
| (dialect) | map_bms | Banyumasan <br>(Dialect of Javanese) | idn | [Wiki Link](https://en.wikipedia.org/wiki/Banyumasan_dialect) | 11839 | 4.83 |
#### 3. Table of Token Statistics for Covered Languages
The token statistics are generated with ```tiktoken``` using the GPT-4 encoder.
| Dataset Lang Code | Total Token | Avg Token per Article | Min Token | Max Token | Token Deciles List |
| :---: | ---: | ---: | ---: | ---: | :--- |
| ace | 1,370,829 | 105.61899992295247 | 3 | 9,659 | [38.0, 52.0, 54.0, 69.0, 76.0, 84.0, 90.0, 123.0, 126.0] |
| ban | 5,924,610 | 287.44893503469024 | 5 | 24,364 | [97.0, 144.0, 165.0, 187.0, 209.0, 245.0, 276.0, 315.0, 421.0] |
| bcl | 6,234,838 | 442.8466510405569 | 2 | 54,049 | [55.0, 95.0, 143.0, 179.0, 226.0, 304.0, 419.0, 587.0, 917.2] |
| bjn | 1,935,505 | 184.28115776444827 | 2 | 30,170 | [36.0, 38.0, 39.0, 40.0, 42.0, 51.0, 82.0, 151.0, 367.0] |
| bug | 553,693 | 55.54147858360919 | 1 | 13,951 | [31.0, 42.0, 43.0, 46.0, 48.0, 50.0, 52.0, 55.0, 57.0] |
| cbk_zam | 402,703 | 179.6177520071365 | 2 | 6,494 | [35.0, 41.2, 56.0, 69.0, 90.0, 120.0, 138.0, 155.0, 294.9] |
| ceb | 1,319,601,771 | 226.92074516435568 | 4 | 221,802 | [93.0, 108.0, 123.0, 136.0, 163.0, 207.0, 278.0, 377.0, 426.0] |
| gor | 1,575,766 | 103.05860039241334 | 2 | 5,525 | [55.0, 58.0, 60.0, 62.0, 64.0, 66.0, 69.0, 75.0, 96.0] |
| id | 325,411,713 | 491.22975561670967 | 1 | 198,597 | [54.0, 93.0, 123.0, 145.0, 180.0, 226.0, 332.0, 543.0, 1068.0] |
| ilo | 5,593,491 | 363.94632051532307 | 17 | 18,202 | [59.0, 80.0, 91.0, 111.0, 152.0, 213.0, 303.0, 461.0, 856.0] |
| jv | 23,528,314 | 321.95284619594963 | 2 | 342,156 | [48.0, 60.0, 75.0, 88.0, 117.0, 175.0, 270.0, 420.0, 772.0] |
| km | 54,559,721 | 4,758.391854177568 | 1 | 1,110,771 | [160.0, 293.0, 452.0, 693.0, 1032.0, 1609.0, 2644.0, 4745.0, 9607.0] |
| lo | 9,395,636 | 1,918.6514192362672 | 3 | 107,154 | [134.0, 184.2, 285.0, 494.0, 658.0, 894.6, 1258.0, 1971.2, 4153.8] |
| mad | 611,736 | 513.2013422818792 | 14 | 17,093 | [80.1, 110.2, 135.0, 161.0, 194.0, 242.0, 302.7, 531.4, 1167.1] |
| map_bms | 1,307,244 | 110.41844750401216 | 1 | 20,629 | [20.0, 21.0, 22.0, 24.0, 30.0, 35.0, 36.0, 38.0, 111.0] |
| min | 33,114,184 | 146.54109358681606 | 3 | 58,387 | [81.0, 91.0, 96.0, 108.0, 119.0, 135.0, 156.0, 168.0, 170.0] |
| mnw | 31,595,647 | 9,659.3234484867 | 6 | 1,450,765 | [425.0, 601.0, 629.0, 682.0, 763.0, 2103.0, 4255.0, 7724.0, 14517.0] |
| ms | 121,343,673 | 348.64363228892813 | 1 | 68,545 | [32.0, 40.0, 49.0, 63.0, 105.0, 138.0, 216.0, 362.0, 788.0] |
| my | 189,439,447 | 1,740.8673761015998 | 10 | 1,376,658 | [164.0, 269.0, 350.0, 508.0, 559.0, 578.0, 605.0, 892.4, 3369.0] |
| nia | 795,527 | 464.134772462077 | 8 | 18,650 | [59.0, 61.0, 63.0, 65.0, 67.0, 86.0, 239.1, 623.4, 1249.7] |
| pag | 222,366 | 200.6913357400722 | 5 | 10,143 | [31.0, 51.0, 73.0, 110.0, 118.0, 120.0, 127.0, 181.0, 355.8] |
| pam | 2,269,091 | 254.04064039408868 | 1 | 14,912 | [38.0, 56.0, 78.0, 108.0, 121.0, 150.0, 193.0, 289.0, 525.8] |
| shn | 23,125,637 | 1,692.6977748499487 | 2 | 204,094 | [460.0, 480.0, 585.0, 679.0, 715.0, 740.0, 756.0, 780.0, 1580.9] |
| su | 14,710,124 | 239.07627297697022 | 1 | 99,456 | [41.0, 43.0, 45.0, 49.0, 70.0, 146.0, 216.0, 219.0, 419.0] |
| ta | 376,043,508 | 2,341.782961763607 | 15 | 177,054 | [543.0, 700.0, 824.0, 1001.0, 1153.0, 1465.0, 1992.0, 2911.0, 4652.0] |
| tet | 487,016 | 332.6612021857924 | 4 | 24,287 | [30.3, 47.0, 66.9, 101.0, 164.0, 177.0, 187.0, 248.6, 604.4] |
| th | 330,964,733 | 2,072.8566695476807 | 1 | 289,150 | [231.0, 390.0, 546.0, 727.0, 969.0, 1276.0, 1741.0, 2533.0, 4361.0] |
| tl | 27,789,730 | 615.8934864032269 | 7 | 60,728 | [73.0, 116.0, 161.0, 214.0, 281.0, 360.0, 465.0, 666.0, 1136.0] |
| vi | 546,481,913 | 424.3161900813099 | 3 | 246,463 | [46.0, 64.0, 71.0, 80.0, 86.0, 92.0, 120.0, 240.0, 824.0] |
| war | 117,438,315 | 92.74833676090108 | 1 | 25,689 | [60.0, 77.0, 81.0, 84.0, 87.0, 90.0, 94.0, 99.0, 110.0] |
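As a rough illustration, the per-language summaries in the table above can be computed from a list of per-article token counts. In the actual pipeline the counts come from `tiktoken`'s GPT-4 encoding; the sketch below only shows the summary step using the Python standard library, and the exact decile values may differ slightly from the table depending on the interpolation method used.

```python
import statistics


def token_stats(token_counts):
    """Summarize per-article token counts: total, average, min, max,
    and the nine decile cut points (as listed in the table above)."""
    deciles = statistics.quantiles(token_counts, n=10)  # 9 cut points
    return {
        "total": sum(token_counts),
        "avg": sum(token_counts) / len(token_counts),
        "min": min(token_counts),
        "max": max(token_counts),
        "deciles": deciles,
    }
```

In practice, `token_counts` would be built by encoding each article's text with the GPT-4 tokenizer and taking the length of the resulting token list.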
Some other SEA languages that already have a Wiki index at Wikimedia might be missing from this list. Any language-update PR is greatly appreciated!
### How is the data preprocessed? What makes it different from loading it directly from the Wikipedia HF dataset?
The data available here is processed with the following flow:
1. The raw data is deduplicated on ```title``` and ```text``` (the text content of a given article) to remove articles containing boilerplate text (template text typically used for unavailable information or for requesting content contributions), which is usually considered noisy for NLP data.
2. The ```title``` and ```text``` fields are then checked for string-matching duplication after light pre-processing (i.e., symbols removed, HTML tags stripped, and ASCII/UTF-8 characters validated).
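A minimal sketch of such a dedup pass, assuming hypothetical `normalize` and `dedup_articles` helpers (the actual implementation lives in the SEA Wiki repo), might look like:

```python
import hashlib
import html
import re


def normalize(text: str) -> str:
    """Rough normalization sketch: unescape HTML entities, strip tags,
    drop symbols, and collapse whitespace before comparison."""
    text = html.unescape(text)
    text = re.sub(r"<[^>]+>", " ", text)   # strip HTML tags
    text = re.sub(r"[^\w\s]", " ", text)   # drop symbols/punctuation
    return re.sub(r"\s+", " ", text).strip().lower()


def dedup_articles(articles):
    """Keep only the first article for each normalized (title, text) pair."""
    seen, kept = set(), []
    for art in articles:
        key = hashlib.md5(
            (normalize(art["title"]) + "\x00" + normalize(art["text"])).encode("utf-8")
        ).hexdigest()
        if key not in seen:
            seen.add(key)
            kept.append(art)
    return kept
```

With this normalization, two copies of the same article that differ only in markup or punctuation collapse to one record.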
The source code can be found in this GitHub repo: [SEA Wiki Github Source Code](https://github.com/sabilmakbar/sea_wiki)
### How do I extract a new Wikipedia dataset of SEA languages?
Please refer to the corresponding GitHub repo for more detailed info: [SEA Wiki Github Source Code](https://github.com/sabilmakbar/sea_wiki)
## Citation Info:
```
@ONLINE{wikidump,
author = "Wikimedia Foundation",
title = "Wikimedia Downloads",
url = "https://dumps.wikimedia.org"}
@ONLINE{wikipedia-hf,
title = "Huggingface Wikipedia Dataset",
url = "https://huggingface.co/datasets/wikipedia"}
```
|
Apinapi/Lucas | ---
license: openrail
---
|
mahdibaghbanzadeh/GUE_prom_prom_core_all | ---
dataset_info:
features:
- name: sequence
dtype: string
- name: labels
dtype:
class_label:
names:
'0': '0'
'1': '1'
splits:
- name: train
num_bytes: 3883192
num_examples: 47356
- name: val
num_bytes: 485440
num_examples: 5920
- name: test
num_bytes: 485440
num_examples: 5920
download_size: 2198997
dataset_size: 4854072
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
---
|
ribhu/muse_synthetic | ---
license: agpl-3.0
---
|
Cemspule636/Core_database | ---
license: bigscience-openrail-m
---
|
SauravMaheshkar/pareto-cora | ---
size_categories:
- 1K<n<10K
task_categories:
- graph-ml
license: cc
---
## Dataset Information
| # Nodes | # Edges | # Features |
|:-------:|:-------:|:----------:|
| 19,793 | 126,842 | 8,710 |
Pre-processed as per the official codebase of https://arxiv.org/abs/2210.02016
## Citations
```
@inproceedings{ju2023multi,
title={Multi-task Self-supervised Graph Neural Networks Enable Stronger Task Generalization},
author={Ju, Mingxuan and Zhao, Tong and Wen, Qianlong and Yu, Wenhao and Shah, Neil and Ye, Yanfang and Zhang, Chuxu},
booktitle={International Conference on Learning Representations},
year={2023}
}
```
|
Dikshaa-malhotra/reviewssyntheticdata100 | ---
dataset_info:
features:
- name: product
dtype: string
- name: description
dtype: string
- name: critique
dtype: string
splits:
- name: train
num_bytes: 125175
num_examples: 100
download_size: 90149
dataset_size: 125175
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "reviewssyntheticdata100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ashutosh09/galaxy | ---
license: apache-2.0
---
|
dltdojo/hfh4_oasst1_zh | ---
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 30744254.277176227
num_examples: 19034
- name: test
num_bytes: 3416207.722823774
num_examples: 2115
- name: train_ift
num_bytes: 30744254.277176227
num_examples: 19034
- name: test_ift
num_bytes: 3416207.722823774
num_examples: 2115
download_size: 37300334
dataset_size: 68320924.0
---
# Dataset Card for "hfh4_oasst1_zh"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sukantan/nyaya-ae-all-mpnet-base-v2-legal-v1 | ---
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
splits:
- name: train
num_bytes: 138405888
num_examples: 45054
download_size: 189510817
dataset_size: 138405888
---
# Dataset Card for "nyaya-ae-all-mpnet-base-v2-legal-v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
imdatta0/gsm_8k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4075297
num_examples: 7473
- name: test
num_bytes: 140555
num_examples: 251
download_size: 2246650
dataset_size: 4215852
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_AdaptLLM__medicine-chat | ---
pretty_name: Evaluation run of AdaptLLM/medicine-chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AdaptLLM/medicine-chat](https://huggingface.co/AdaptLLM/medicine-chat) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AdaptLLM__medicine-chat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-05T00:10:29.742802](https://huggingface.co/datasets/open-llm-leaderboard/details_AdaptLLM__medicine-chat/blob/main/results_2024-01-05T00-10-29.742802.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.49991587255247455,\n\
\ \"acc_stderr\": 0.034306220187286095,\n \"acc_norm\": 0.5048893452943634,\n\
\ \"acc_norm_stderr\": 0.035069432938668016,\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.015905987048184828,\n \"mc2\": 0.4346323175004823,\n\
\ \"mc2_stderr\": 0.01476152876710364\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4931740614334471,\n \"acc_stderr\": 0.014610029151379813,\n\
\ \"acc_norm\": 0.537542662116041,\n \"acc_norm_stderr\": 0.01457014449507558\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5654252141007767,\n\
\ \"acc_stderr\": 0.004946879874422681,\n \"acc_norm\": 0.7611033658633738,\n\
\ \"acc_norm_stderr\": 0.004255380050015102\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04068942293855797,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04068942293855797\n },\n\
\ \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n \
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5433962264150943,\n \"acc_stderr\": 0.03065674869673943,\n\
\ \"acc_norm\": 0.5433962264150943,\n \"acc_norm_stderr\": 0.03065674869673943\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5486111111111112,\n\
\ \"acc_stderr\": 0.041614023984032786,\n \"acc_norm\": 0.5486111111111112,\n\
\ \"acc_norm_stderr\": 0.041614023984032786\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.44508670520231214,\n\
\ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.44508670520231214,\n\
\ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4553191489361702,\n \"acc_stderr\": 0.03255525359340355,\n\
\ \"acc_norm\": 0.4553191489361702,\n \"acc_norm_stderr\": 0.03255525359340355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30423280423280424,\n \"acc_stderr\": 0.023695415009463087,\n \"\
acc_norm\": 0.30423280423280424,\n \"acc_norm_stderr\": 0.023695415009463087\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.040735243221471255,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.040735243221471255\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5096774193548387,\n \"acc_stderr\": 0.02843867799890955,\n \"\
acc_norm\": 0.5096774193548387,\n \"acc_norm_stderr\": 0.02843867799890955\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3497536945812808,\n \"acc_stderr\": 0.03355400904969566,\n \"\
acc_norm\": 0.3497536945812808,\n \"acc_norm_stderr\": 0.03355400904969566\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.036639749943912434,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.036639749943912434\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6464646464646465,\n \"acc_stderr\": 0.03406086723547155,\n \"\
acc_norm\": 0.6464646464646465,\n \"acc_norm_stderr\": 0.03406086723547155\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7409326424870466,\n \"acc_stderr\": 0.03161877917935413,\n\
\ \"acc_norm\": 0.7409326424870466,\n \"acc_norm_stderr\": 0.03161877917935413\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.46153846153846156,\n \"acc_stderr\": 0.025275892070240634,\n\
\ \"acc_norm\": 0.46153846153846156,\n \"acc_norm_stderr\": 0.025275892070240634\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815632,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815632\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.03242225027115006,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.03242225027115006\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7009174311926606,\n \"acc_stderr\": 0.019630417285415182,\n \"\
acc_norm\": 0.7009174311926606,\n \"acc_norm_stderr\": 0.019630417285415182\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.37962962962962965,\n \"acc_stderr\": 0.03309682581119035,\n \"\
acc_norm\": 0.37962962962962965,\n \"acc_norm_stderr\": 0.03309682581119035\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7009803921568627,\n \"acc_stderr\": 0.03213325717373616,\n \"\
acc_norm\": 0.7009803921568627,\n \"acc_norm_stderr\": 0.03213325717373616\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7130801687763713,\n \"acc_stderr\": 0.02944377302259469,\n \
\ \"acc_norm\": 0.7130801687763713,\n \"acc_norm_stderr\": 0.02944377302259469\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.57847533632287,\n\
\ \"acc_stderr\": 0.033141902221106564,\n \"acc_norm\": 0.57847533632287,\n\
\ \"acc_norm_stderr\": 0.033141902221106564\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870255,\n\
\ \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870255\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6033057851239669,\n \"acc_stderr\": 0.044658697805310094,\n \"\
acc_norm\": 0.6033057851239669,\n \"acc_norm_stderr\": 0.044658697805310094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04803752235190193,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04803752235190193\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6257668711656442,\n \"acc_stderr\": 0.03802068102899616,\n\
\ \"acc_norm\": 0.6257668711656442,\n \"acc_norm_stderr\": 0.03802068102899616\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503948,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503948\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7649572649572649,\n\
\ \"acc_stderr\": 0.027778835904935437,\n \"acc_norm\": 0.7649572649572649,\n\
\ \"acc_norm_stderr\": 0.027778835904935437\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6960408684546615,\n\
\ \"acc_stderr\": 0.016448321686769043,\n \"acc_norm\": 0.6960408684546615,\n\
\ \"acc_norm_stderr\": 0.016448321686769043\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5375722543352601,\n \"acc_stderr\": 0.026842985519615375,\n\
\ \"acc_norm\": 0.5375722543352601,\n \"acc_norm_stderr\": 0.026842985519615375\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25027932960893856,\n\
\ \"acc_stderr\": 0.014487500852850414,\n \"acc_norm\": 0.25027932960893856,\n\
\ \"acc_norm_stderr\": 0.014487500852850414\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.028580341065138293,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.028580341065138293\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5755627009646302,\n\
\ \"acc_stderr\": 0.028071928247946208,\n \"acc_norm\": 0.5755627009646302,\n\
\ \"acc_norm_stderr\": 0.028071928247946208\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5401234567901234,\n \"acc_stderr\": 0.027731022753539277,\n\
\ \"acc_norm\": 0.5401234567901234,\n \"acc_norm_stderr\": 0.027731022753539277\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36524822695035464,\n \"acc_stderr\": 0.02872386385328128,\n \
\ \"acc_norm\": 0.36524822695035464,\n \"acc_norm_stderr\": 0.02872386385328128\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3644067796610169,\n\
\ \"acc_stderr\": 0.012291694983056479,\n \"acc_norm\": 0.3644067796610169,\n\
\ \"acc_norm_stderr\": 0.012291694983056479\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5073529411764706,\n \"acc_stderr\": 0.030369552523902173,\n\
\ \"acc_norm\": 0.5073529411764706,\n \"acc_norm_stderr\": 0.030369552523902173\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4852941176470588,\n \"acc_stderr\": 0.020219083895133924,\n \
\ \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.020219083895133924\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n\
\ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n\
\ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5551020408163265,\n \"acc_stderr\": 0.031814251181977865,\n\
\ \"acc_norm\": 0.5551020408163265,\n \"acc_norm_stderr\": 0.031814251181977865\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6517412935323383,\n\
\ \"acc_stderr\": 0.03368787466115459,\n \"acc_norm\": 0.6517412935323383,\n\
\ \"acc_norm_stderr\": 0.03368787466115459\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39759036144578314,\n\
\ \"acc_stderr\": 0.038099730845402184,\n \"acc_norm\": 0.39759036144578314,\n\
\ \"acc_norm_stderr\": 0.038099730845402184\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.0352821125824523,\n\
\ \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.0352821125824523\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.015905987048184828,\n \"mc2\": 0.4346323175004823,\n\
\ \"mc2_stderr\": 0.01476152876710364\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7569060773480663,\n \"acc_stderr\": 0.012055665630431036\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.18953752843062927,\n \
\ \"acc_stderr\": 0.010795837931896387\n }\n}\n```"
repo_url: https://huggingface.co/AdaptLLM/medicine-chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|arc:challenge|25_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|gsm8k|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hellaswag|10_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-10-29.742802.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T00-10-29.742802.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- '**/details_harness|winogrande|5_2024-01-05T00-10-29.742802.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-05T00-10-29.742802.parquet'
- config_name: results
data_files:
- split: 2024_01_05T00_10_29.742802
path:
- results_2024-01-05T00-10-29.742802.parquet
- split: latest
path:
- results_2024-01-05T00-10-29.742802.parquet
---
# Dataset Card for Evaluation run of AdaptLLM/medicine-chat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AdaptLLM/medicine-chat](https://huggingface.co/AdaptLLM/medicine-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AdaptLLM__medicine-chat",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-05T00:10:29.742802](https://huggingface.co/datasets/open-llm-leaderboard/details_AdaptLLM__medicine-chat/blob/main/results_2024-01-05T00-10-29.742802.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.49991587255247455,
"acc_stderr": 0.034306220187286095,
"acc_norm": 0.5048893452943634,
"acc_norm_stderr": 0.035069432938668016,
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184828,
"mc2": 0.4346323175004823,
"mc2_stderr": 0.01476152876710364
},
"harness|arc:challenge|25": {
"acc": 0.4931740614334471,
"acc_stderr": 0.014610029151379813,
"acc_norm": 0.537542662116041,
"acc_norm_stderr": 0.01457014449507558
},
"harness|hellaswag|10": {
"acc": 0.5654252141007767,
"acc_stderr": 0.004946879874422681,
"acc_norm": 0.7611033658633738,
"acc_norm_stderr": 0.004255380050015102
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5,
"acc_stderr": 0.04068942293855797,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04068942293855797
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5433962264150943,
"acc_stderr": 0.03065674869673943,
"acc_norm": 0.5433962264150943,
"acc_norm_stderr": 0.03065674869673943
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5486111111111112,
"acc_stderr": 0.041614023984032786,
"acc_norm": 0.5486111111111112,
"acc_norm_stderr": 0.041614023984032786
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.44508670520231214,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.44508670520231214,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4553191489361702,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.4553191489361702,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30423280423280424,
"acc_stderr": 0.023695415009463087,
"acc_norm": 0.30423280423280424,
"acc_norm_stderr": 0.023695415009463087
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.040735243221471255,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.040735243221471255
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5096774193548387,
"acc_stderr": 0.02843867799890955,
"acc_norm": 0.5096774193548387,
"acc_norm_stderr": 0.02843867799890955
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3497536945812808,
"acc_stderr": 0.03355400904969566,
"acc_norm": 0.3497536945812808,
"acc_norm_stderr": 0.03355400904969566
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.036639749943912434,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.036639749943912434
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6464646464646465,
"acc_stderr": 0.03406086723547155,
"acc_norm": 0.6464646464646465,
"acc_norm_stderr": 0.03406086723547155
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7409326424870466,
"acc_stderr": 0.03161877917935413,
"acc_norm": 0.7409326424870466,
"acc_norm_stderr": 0.03161877917935413
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.46153846153846156,
"acc_stderr": 0.025275892070240634,
"acc_norm": 0.46153846153846156,
"acc_norm_stderr": 0.025275892070240634
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815632,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815632
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.03242225027115006,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.03242225027115006
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7009174311926606,
"acc_stderr": 0.019630417285415182,
"acc_norm": 0.7009174311926606,
"acc_norm_stderr": 0.019630417285415182
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.03309682581119035,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.03309682581119035
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7009803921568627,
"acc_stderr": 0.03213325717373616,
"acc_norm": 0.7009803921568627,
"acc_norm_stderr": 0.03213325717373616
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7130801687763713,
"acc_stderr": 0.02944377302259469,
"acc_norm": 0.7130801687763713,
"acc_norm_stderr": 0.02944377302259469
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.57847533632287,
"acc_stderr": 0.033141902221106564,
"acc_norm": 0.57847533632287,
"acc_norm_stderr": 0.033141902221106564
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870255,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870255
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6033057851239669,
"acc_stderr": 0.044658697805310094,
"acc_norm": 0.6033057851239669,
"acc_norm_stderr": 0.044658697805310094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04803752235190193,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04803752235190193
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6257668711656442,
"acc_stderr": 0.03802068102899616,
"acc_norm": 0.6257668711656442,
"acc_norm_stderr": 0.03802068102899616
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.04541609446503948,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.04541609446503948
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7649572649572649,
"acc_stderr": 0.027778835904935437,
"acc_norm": 0.7649572649572649,
"acc_norm_stderr": 0.027778835904935437
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6960408684546615,
"acc_stderr": 0.016448321686769043,
"acc_norm": 0.6960408684546615,
"acc_norm_stderr": 0.016448321686769043
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5375722543352601,
"acc_stderr": 0.026842985519615375,
"acc_norm": 0.5375722543352601,
"acc_norm_stderr": 0.026842985519615375
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25027932960893856,
"acc_stderr": 0.014487500852850414,
"acc_norm": 0.25027932960893856,
"acc_norm_stderr": 0.014487500852850414
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.028580341065138293,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.028580341065138293
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5755627009646302,
"acc_stderr": 0.028071928247946208,
"acc_norm": 0.5755627009646302,
"acc_norm_stderr": 0.028071928247946208
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5401234567901234,
"acc_stderr": 0.027731022753539277,
"acc_norm": 0.5401234567901234,
"acc_norm_stderr": 0.027731022753539277
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36524822695035464,
"acc_stderr": 0.02872386385328128,
"acc_norm": 0.36524822695035464,
"acc_norm_stderr": 0.02872386385328128
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3644067796610169,
"acc_stderr": 0.012291694983056479,
"acc_norm": 0.3644067796610169,
"acc_norm_stderr": 0.012291694983056479
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5073529411764706,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.5073529411764706,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.020219083895133924,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.020219083895133924
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5551020408163265,
"acc_stderr": 0.031814251181977865,
"acc_norm": 0.5551020408163265,
"acc_norm_stderr": 0.031814251181977865
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6517412935323383,
"acc_stderr": 0.03368787466115459,
"acc_norm": 0.6517412935323383,
"acc_norm_stderr": 0.03368787466115459
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39759036144578314,
"acc_stderr": 0.038099730845402184,
"acc_norm": 0.39759036144578314,
"acc_norm_stderr": 0.038099730845402184
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.695906432748538,
"acc_stderr": 0.0352821125824523,
"acc_norm": 0.695906432748538,
"acc_norm_stderr": 0.0352821125824523
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184828,
"mc2": 0.4346323175004823,
"mc2_stderr": 0.01476152876710364
},
"harness|winogrande|5": {
"acc": 0.7569060773480663,
"acc_stderr": 0.012055665630431036
},
"harness|gsm8k|5": {
"acc": 0.18953752843062927,
"acc_stderr": 0.010795837931896387
}
}
```
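Once loaded, the per-task metrics above are just a nested dictionary keyed by task name. A minimal sketch of sifting through them in plain Python, using a hand-copied subset of the values shown above (not fetched from the repo), to find the strongest and weakest tasks:

```python
# Hand-copied subset of the "latest" results shown above.
results = {
    "harness|hendrycksTest-marketing|5": {"acc": 0.7649572649572649},
    "harness|hendrycksTest-moral_scenarios|5": {"acc": 0.25027932960893856},
    "harness|winogrande|5": {"acc": 0.7569060773480663},
    "harness|gsm8k|5": {"acc": 0.18953752843062927},
}

# Rank tasks by accuracy.
best = max(results, key=lambda task: results[task]["acc"])
worst = min(results, key=lambda task: results[task]["acc"])
print(best)   # harness|hendrycksTest-marketing|5
print(worst)  # harness|gsm8k|5
```

The same pattern works on the full `results` configuration once loaded with `load_dataset`.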
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Hamalia/autotrain-data-triageinfojuri | ---
language:
- fr
task_categories:
- text-classification
---
# AutoTrain Dataset for project: triageinfojuri
## Dataset Description
This dataset has been automatically processed by AutoTrain for project triageinfojuri.
### Languages
The BCP-47 code for the dataset's language is fr.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"target": 11,
"text": "Je suis intervenant psychosociale depuis 2007 au Regroupement des familles monoparentales et recompos\u00e9es de Laval (RFMRL), organisme sans but lucratif. L\u2019organisme n\u2019offre aucune compensation pour les 14 premiers jours. Je suis en Arr\u00eat de travail du 28 novembre au 12 d\u00e9cembre 2022 inclusivement pour \u00e9puisement professionnel pour raison m\u00e9dicale. J\u2019aimerais savoir comment ca fonctionne ? - M\u2019inscrire tout de suite \u00e0 l\u2019AE (15 semaines avant le 18 d\u00e9cembre ou 26 si apr\u00e8s le 18). Je peux attendre ? - M\u2019inscrire pour indemnit\u00e9 de la CNESST en m\u00eame temps ? - On m\u2019a sugg\u00e9r\u00e9 de prendre des vacances."
},
{
"target": 8,
"text": "J'ai achet\u00e9 une voiture dans un encan, je l'ai fait r\u00e9par\u00e9e chez un garagiste \u00e0 Trois Rivi\u00e8res (Andr\u00e9 Simon Inc.) et la derni\u00e8re estim\u00e9e \u00e9tais de 3500 et la facture finale pr\u00e8s de 10,000 tentative de ne n\u00e9gociation nulle,mercci"
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"target": "ClassLabel(names=['ADMINISTRATIF', 'AIN\u00c9S', 'ASSURANCES', 'AUTEUR', 'CONSOMMATION', 'CONSOMMMATION', 'CORPORATIF', 'CRIMINALIT\u00c9', 'CR\u00c9ANCES', 'DIVORCE', 'DPJ', 'EMPLOI', 'FAILLITE', 'FAMILLE', 'FISCALIT\u00c9', 'FRAUDE', 'INAPTITUDE', 'INTERNATIONAL', 'JEUNESSE', 'LOGEMENT', 'LOUAGE', 'L\u00c9SION', 'MUNICIPAL', 'MUNICPAL', 'PERSONNALIT\u00c9', 'PLUMITIFS', 'PROC\u00c9DURES ', 'PROFESSION', 'PROPRI\u00c9T\u00c9', 'Proc\u00e9dures ', 'RENSEIGNEMENTS', 'RESPONSABILIT\u00c9', 'SERVICE', 'SOINS', 'VENTE'], id=None)",
"text": "Value(dtype='string', id=None)"
}
```
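The integer `target` values index into the `ClassLabel` names above: the first sample's `target` of 11 decodes to `EMPLOI`, and the second's `target` of 8 to `CRÉANCES`. A plain-Python sketch of the mapping, with the label list copied verbatim from the feature definition (including its trailing-space variants):

```python
# Label names copied in order from the ClassLabel definition above.
LABELS = [
    "ADMINISTRATIF", "AINÉS", "ASSURANCES", "AUTEUR", "CONSOMMATION",
    "CONSOMMMATION", "CORPORATIF", "CRIMINALITÉ", "CRÉANCES", "DIVORCE",
    "DPJ", "EMPLOI", "FAILLITE", "FAMILLE", "FISCALITÉ", "FRAUDE",
    "INAPTITUDE", "INTERNATIONAL", "JEUNESSE", "LOGEMENT", "LOUAGE",
    "LÉSION", "MUNICIPAL", "MUNICPAL", "PERSONNALITÉ", "PLUMITIFS",
    "PROCÉDURES ", "PROFESSION", "PROPRIÉTÉ", "Procédures ",
    "RENSEIGNEMENTS", "RESPONSABILITÉ", "SERVICE", "SOINS", "VENTE",
]

def decode_target(target: int) -> str:
    """Map an integer target back to its class name."""
    return LABELS[target]

print(decode_target(11))  # EMPLOI (the employment-insurance question above)
print(decode_target(8))   # CRÉANCES (the repair-bill dispute above)
```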
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 619 |
| valid | 174 |
|
breno30/Alcyr2024 | ---
license: openrail
---
|
SEACrowd/indspeech_newstra_ethnicsr | ---
tags:
- speech-recognition
language:
- sun
- jav
- btk
- ban
---
# indspeech_newstra_ethnicsr
INDspeech_NEWSTRA_EthnicSR is a collection of graphemically balanced and parallel speech corpora of four major Indonesian ethnic languages: Javanese, Sundanese, Balinese, and Bataks. It was developed in 2013 by the Nara Institute of Science and Technology (NAIST, Japan) [Sakti et al., 2013]. The data has been used to develop Indonesian ethnic speech recognition with supervised learning [Sakti et al., 2014] and semi-supervised learning [Novitasari et al., 2020] based on the Machine Speech Chain framework [Tjandra et al., 2020].
## Dataset Usage
Run `pip install nusacrowd` before loading the dataset through HuggingFace's `load_dataset`.
## Citation
```
@inproceedings{sakti-cocosda-2013,
title = "Towards Language Preservation: Design and Collection of Graphemically Balanced and Parallel Speech Corpora of {I}ndonesian Ethnic Languages",
author = "Sakti, Sakriani and Nakamura, Satoshi",
booktitle = "Proc. Oriental COCOSDA",
year = "2013",
address = "Gurgaon, India"
}
@inproceedings{sakti-sltu-2014,
title = "Recent progress in developing grapheme-based speech recognition for {I}ndonesian ethnic languages: {J}avanese, {S}undanese, {B}alinese and {B}ataks",
author = "Sakti, Sakriani and Nakamura, Satoshi",
booktitle = "Proc. 4th Workshop on Spoken Language Technologies for Under-Resourced Languages (SLTU 2014)",
year = "2014",
pages = "46--52",
address = "St. Petersburg, Russia"
}
@inproceedings{novitasari-sltu-2020,
title = "Cross-Lingual Machine Speech Chain for {J}avanese, {S}undanese, {B}alinese, and {B}ataks Speech Recognition and Synthesis",
author = "Novitasari, Sashi and Tjandra, Andros and Sakti, Sakriani and Nakamura, Satoshi",
booktitle = "Proc. Joint Workshop on Spoken Language Technologies for Under-resourced languages (SLTU) and Collaboration and Computing for Under-Resourced Languages (CCURL)",
year = "2020",
pages = "131--138",
address = "Marseille, France"
}
```
## License
CC-BY-NC-SA 4.0
## Homepage
[https://github.com/s-sakti/data_indsp_newstra_ethnicsr](https://github.com/s-sakti/data_indsp_newstra_ethnicsr)
### NusaCatalogue
For easy indexing and metadata: [https://indonlp.github.io/nusa-catalogue](https://indonlp.github.io/nusa-catalogue) |
yzhuang/autotree_automl_pol_gosdt_l512_d3 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: int64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: int64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 13320800000
num_examples: 100000
- name: validation
num_bytes: 1332080000
num_examples: 10000
download_size: 960924312
dataset_size: 14652880000
---
# Dataset Card for "autotree_automl_pol_gosdt_l512_d3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
veswaran/sample-dataset | ---
license: mit
---
|
agicorp/Agentinstruct | ---
configs:
- config_name: default
data_files:
- split: os
path: data/os-*
- split: db
path: data/db-*
- split: alfworld
path: data/alfworld-*
- split: webshop
path: data/webshop-*
- split: kg
path: data/kg-*
- split: mind2web
path: data/mind2web-*
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: loss
dtype: bool
- name: value
dtype: string
- name: id
dtype: string
splits:
- name: os
num_bytes: 660245
num_examples: 195
- name: db
num_bytes: 1436655
num_examples: 538
- name: alfworld
num_bytes: 1223363
num_examples: 336
- name: webshop
num_bytes: 1602648
num_examples: 351
- name: kg
num_bytes: 2960010
num_examples: 324
- name: mind2web
num_bytes: 159590
num_examples: 122
download_size: 1255385
dataset_size: 8042511
language:
- en
pretty_name: AgentInstruct
---
# AgentInstruct Dataset
<p align="center">
🤗 <a href="https://huggingface.co/THUDM/agentlm-70b" target="_blank">[Models]</a> • 💻 <a href="https://github.com/THUDM/AgentTuning" target="_blank">[Github Repo]</a> • 📌 <a href="https://THUDM.github.io/AgentTuning/" target="_blank">[Project Page]</a> • 📃 <a href="https://arxiv.org/abs/2310.12823" target="_blank">[Paper]</a>
</p>
**AgentInstruct** is a meticulously curated dataset featuring **1,866** high-quality interactions, designed to enhance AI agents across six diverse real-world tasks, leveraging innovative methods like **Task Derivation** and **Self-Instruct**.
- 🔍 **CoT** - Harness the power of [ReAct](https://react-lm.github.io/), offering detailed thought explanations for each action, ensuring an intricate understanding of the model's decision-making journey.
- 🌍 **Diversity** - Spanning 6 real-world scenarios, from Daily Household Routines to Database Operations, with average turns ranging from 5 to 35.
- 🎯 **Precision** - Not all trajectories of GPT-4 are effective! Ours are rigorously filtered using strict rewards to ensure top-notch quality.
- ✅ **Assurance** - Rigorous checks to avoid data leakage, ensuring pristine dataset quality.
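As declared in the card's metadata, each record stores a `conversations` list of turns, where `from` names the speaker, `value` holds the text, and `loss` marks whether the turn contributes to the training loss. A sketch of filtering the loss-bearing (agent) turns, using a hypothetical record (the field values shown are illustrative, not from the dataset):

```python
# Hypothetical record following the `conversations` schema declared above.
record = {
    "id": "os_0",
    "conversations": [
        {"from": "human", "value": "Count the files in /tmp.", "loss": False},
        {"from": "gpt",
         "value": "Thought: I should count entries with ls.\nAction: bash ls /tmp | wc -l",
         "loss": True},
    ],
}

# Keep only turns that contribute to the training loss (the agent's turns).
loss_turns = [t["value"] for t in record["conversations"] if t["loss"]]
print(len(loss_turns))  # 1
```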
## Task Overview
| Task | # Filt. Traj. | Avg # Filt. Traj. Turns |
|---|---|---|
|ALFWorld|336|13.52|
|WebShop|351|3.68|
|Mind2Web|122|1.00|
|Knowledge Graph|324|6.04|
|Operating System|195|3.85|
|Database|538|2.06|
|**AgentInstruct**|1866|5.24|
AgentInstruct includes 1,866 trajectories from 6 agent tasks. "Traj." stands for interaction trajectory; "Filt. Traj." stands for filtered trajectories.
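The totals in the last row are consistent with the per-task rows: 1,866 is the sum of the filtered trajectory counts, and 5.24 is their count-weighted mean number of turns, as a quick check shows:

```python
# Per-task (filtered trajectory count, average turns) from the table above.
tasks = {
    "ALFWorld": (336, 13.52),
    "WebShop": (351, 3.68),
    "Mind2Web": (122, 1.00),
    "Knowledge Graph": (324, 6.04),
    "Operating System": (195, 3.85),
    "Database": (538, 2.06),
}

total = sum(n for n, _ in tasks.values())
weighted_avg = sum(n * turns for n, turns in tasks.values()) / total

print(total)                   # 1866
print(round(weighted_avg, 2))  # 5.24
```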
## Models
**AgentLM** models are produced by mixed training on the AgentInstruct dataset and the ShareGPT dataset, starting from Llama-2-chat models.
The models follow the conversation format of [Llama-2-chat](https://huggingface.co/blog/llama2#how-to-prompt-llama-2), with the system prompt fixed as
```
You are a helpful, respectful and honest assistant.
```
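Concretely, a single-turn request under this format can be assembled as follows — a sketch of the standard Llama-2-chat template linked above (multi-turn conversations repeat the `[INST] ... [/INST]` blocks with the model's replies in between):

```python
SYSTEM_PROMPT = "You are a helpful, respectful and honest assistant."

def build_prompt(user_message: str) -> str:
    """Wrap a single user turn in the Llama-2-chat format with the fixed system prompt."""
    return (
        f"[INST] <<SYS>>\n{SYSTEM_PROMPT}\n<</SYS>>\n\n"
        f"{user_message} [/INST]"
    )

prompt = build_prompt("List the files in the current directory.")
print(prompt)
```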
7B, 13B, and 70B models are available on the Huggingface model hub.
|Model|Huggingface Repo|
|---|---|
|AgentLM-7B| [🤗Huggingface Repo](https://huggingface.co/THUDM/agentlm-7b) |
|AgentLM-13B| [🤗Huggingface Repo](https://huggingface.co/THUDM/agentlm-13b) |
|AgentLM-70B| [🤗Huggingface Repo](https://huggingface.co/THUDM/agentlm-70b) |
Check our [[Github Repo]](https://github.com/THUDM/AgentTuning) for details about **AgentTuning**.
## Citation
If you find our work useful, please consider citing AgentTuning:
```
@misc{zeng2023agenttuning,
title={AgentTuning: Enabling Generalized Agent Abilities for LLMs},
author={Aohan Zeng and Mingdao Liu and Rui Lu and Bowen Wang and Xiao Liu and Yuxiao Dong and Jie Tang},
year={2023},
eprint={2310.12823},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
tyzhu/squad_qa_wrong_rare_v5_full_recite_ans_sent_random_permute_rerun_8 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: correct_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 10209997.191323694
num_examples: 6305
- name: validation
num_bytes: 409972
num_examples: 300
download_size: 1674811
dataset_size: 10619969.191323694
---
# Dataset Card for "squad_qa_wrong_rare_v5_full_recite_ans_sent_random_permute_rerun_8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jeromtom/msme-compliance | ---
license: apache-2.0
---
|
uyjh/ytgytgh | ---
license: apache-2.0
---
|
kkkkkkkkkkkkkkk/fff | ---
license: openrail
---
|
k0ntra/chainreaction | ---
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
splits:
- name: train
num_bytes: 81408
num_examples: 53
download_size: 258600
dataset_size: 81408
---
# Dataset Card for "chainreaction"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Mihaiii__Bucharest-0.2 | ---
pretty_name: Evaluation run of Mihaiii/Bucharest-0.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Mihaiii/Bucharest-0.2](https://huggingface.co/Mihaiii/Bucharest-0.2) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Mihaiii__Bucharest-0.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-17T19:59:50.888473](https://huggingface.co/datasets/open-llm-leaderboard/details_Mihaiii__Bucharest-0.2/blob/main/results_2024-02-17T19-59-50.888473.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6604284110771916,\n\
\ \"acc_stderr\": 0.03143827349144454,\n \"acc_norm\": 0.663246611615963,\n\
\ \"acc_norm_stderr\": 0.03206864715833573,\n \"mc1\": 0.30599755201958384,\n\
\ \"mc1_stderr\": 0.016132229728155045,\n \"mc2\": 0.45302642542372934,\n\
\ \"mc2_stderr\": 0.014396635503520975\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5921501706484642,\n \"acc_stderr\": 0.014361097288449703,\n\
\ \"acc_norm\": 0.6459044368600683,\n \"acc_norm_stderr\": 0.013975454122756557\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6565425214100776,\n\
\ \"acc_stderr\": 0.004738920624724467,\n \"acc_norm\": 0.8487353116908982,\n\
\ \"acc_norm_stderr\": 0.003575744098779953\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03523807393012047,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03523807393012047\n \
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.69,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n\
\ \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.6878612716763006,\n\
\ \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.048108401480826346,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.048108401480826346\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146267,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146267\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.040434618619167466,\n\
\ \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.040434618619167466\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43915343915343913,\n \"acc_stderr\": 0.025559920550531003,\n \"\
acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.025559920550531003\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8032258064516129,\n\
\ \"acc_stderr\": 0.02261640942074202,\n \"acc_norm\": 0.8032258064516129,\n\
\ \"acc_norm_stderr\": 0.02261640942074202\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03011768892950357,\n\
\ \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03011768892950357\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"\
acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.01742697415424053,\n\
\ \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.01742697415424053\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \
\ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634332,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634332\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8587155963302753,\n \"acc_stderr\": 0.014933868987028072,\n \"\
acc_norm\": 0.8587155963302753,\n \"acc_norm_stderr\": 0.014933868987028072\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5972222222222222,\n \"acc_stderr\": 0.03344887382997866,\n \"\
acc_norm\": 0.5972222222222222,\n \"acc_norm_stderr\": 0.03344887382997866\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8607594936708861,\n \"acc_stderr\": 0.022535526352692705,\n \
\ \"acc_norm\": 0.8607594936708861,\n \"acc_norm_stderr\": 0.022535526352692705\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n\
\ \"acc_stderr\": 0.030360379710291947,\n \"acc_norm\": 0.7130044843049327,\n\
\ \"acc_norm_stderr\": 0.030360379710291947\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573973,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573973\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.020237149008990905,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.020237149008990905\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.013740797258579825,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.013740797258579825\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.02353292543104428,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.02353292543104428\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3340782122905028,\n\
\ \"acc_stderr\": 0.015774911422381622,\n \"acc_norm\": 0.3340782122905028,\n\
\ \"acc_norm_stderr\": 0.015774911422381622\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.023152722439402303,\n\
\ \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.023152722439402303\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042117,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042117\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4876140808344198,\n\
\ \"acc_stderr\": 0.012766317315473556,\n \"acc_norm\": 0.4876140808344198,\n\
\ \"acc_norm_stderr\": 0.012766317315473556\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.026303648393696036,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.026303648393696036\n \
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\"\
: 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \"\
acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7755102040816326,\n \"acc_stderr\": 0.0267114305555384,\n\
\ \"acc_norm\": 0.7755102040816326,\n \"acc_norm_stderr\": 0.0267114305555384\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n\
\ \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n\
\ \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466115,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466115\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30599755201958384,\n\
\ \"mc1_stderr\": 0.016132229728155045,\n \"mc2\": 0.45302642542372934,\n\
\ \"mc2_stderr\": 0.014396635503520975\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8326756116811366,\n \"acc_stderr\": 0.010490608806828079\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5678544351781653,\n \
\ \"acc_stderr\": 0.013645072137842445\n }\n}\n```"
repo_url: https://huggingface.co/Mihaiii/Bucharest-0.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|arc:challenge|25_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|gsm8k|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hellaswag|10_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T19-59-50.888473.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-17T19-59-50.888473.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- '**/details_harness|winogrande|5_2024-02-17T19-59-50.888473.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-17T19-59-50.888473.parquet'
- config_name: results
data_files:
- split: 2024_02_17T19_59_50.888473
path:
- results_2024-02-17T19-59-50.888473.parquet
- split: latest
path:
- results_2024-02-17T19-59-50.888473.parquet
---
# Dataset Card for Evaluation run of Mihaiii/Bucharest-0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Mihaiii/Bucharest-0.2](https://huggingface.co/Mihaiii/Bucharest-0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Mihaiii__Bucharest-0.2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-17T19:59:50.888473](https://huggingface.co/datasets/open-llm-leaderboard/details_Mihaiii__Bucharest-0.2/blob/main/results_2024-02-17T19-59-50.888473.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6604284110771916,
"acc_stderr": 0.03143827349144454,
"acc_norm": 0.663246611615963,
"acc_norm_stderr": 0.03206864715833573,
"mc1": 0.30599755201958384,
"mc1_stderr": 0.016132229728155045,
"mc2": 0.45302642542372934,
"mc2_stderr": 0.014396635503520975
},
"harness|arc:challenge|25": {
"acc": 0.5921501706484642,
"acc_stderr": 0.014361097288449703,
"acc_norm": 0.6459044368600683,
"acc_norm_stderr": 0.013975454122756557
},
"harness|hellaswag|10": {
"acc": 0.6565425214100776,
"acc_stderr": 0.004738920624724467,
"acc_norm": 0.8487353116908982,
"acc_norm_stderr": 0.003575744098779953
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.75,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493864,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.035331333893236574,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.035331333893236574
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.048108401480826346,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.048108401480826346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.040434618619167466,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.040434618619167466
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43915343915343913,
"acc_stderr": 0.025559920550531003,
"acc_norm": 0.43915343915343913,
"acc_norm_stderr": 0.025559920550531003
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8032258064516129,
"acc_stderr": 0.02261640942074202,
"acc_norm": 0.8032258064516129,
"acc_norm_stderr": 0.02261640942074202
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03011768892950357,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03011768892950357
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.01742697415424053,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.01742697415424053
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.029344572500634332,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.029344572500634332
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8587155963302753,
"acc_stderr": 0.014933868987028072,
"acc_norm": 0.8587155963302753,
"acc_norm_stderr": 0.014933868987028072
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.03344887382997866,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.03344887382997866
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8607594936708861,
"acc_stderr": 0.022535526352692705,
"acc_norm": 0.8607594936708861,
"acc_norm_stderr": 0.022535526352692705
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.030360379710291947,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.030360379710291947
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573973,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573973
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.020237149008990905,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.020237149008990905
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579825,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579825
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.02353292543104428,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.02353292543104428
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3340782122905028,
"acc_stderr": 0.015774911422381622,
"acc_norm": 0.3340782122905028,
"acc_norm_stderr": 0.015774911422381622
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.023152722439402303,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.023152722439402303
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042117,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042117
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4876140808344198,
"acc_stderr": 0.012766317315473556,
"acc_norm": 0.4876140808344198,
"acc_norm_stderr": 0.012766317315473556
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.75,
"acc_stderr": 0.026303648393696036,
"acc_norm": 0.75,
"acc_norm_stderr": 0.026303648393696036
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7755102040816326,
"acc_stderr": 0.0267114305555384,
"acc_norm": 0.7755102040816326,
"acc_norm_stderr": 0.0267114305555384
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166327,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466115,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466115
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.038695433234721015,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.038695433234721015
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30599755201958384,
"mc1_stderr": 0.016132229728155045,
"mc2": 0.45302642542372934,
"mc2_stderr": 0.014396635503520975
},
"harness|winogrande|5": {
"acc": 0.8326756116811366,
"acc_stderr": 0.010490608806828079
},
"harness|gsm8k|5": {
"acc": 0.5678544351781653,
"acc_stderr": 0.013645072137842445
}
}
```
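The top-level `"all"` block above summarizes the per-task metrics. As an illustrative sketch (not part of the evaluation harness itself, and using only two hand-copied task entries from the JSON above), a macro-average accuracy over MMLU-style (`hendrycksTest`) tasks can be computed like this:

```python
# Illustrative only: two per-task entries copied from the results JSON above.
# Whether the leaderboard aggregates with an unweighted mean is an assumption here.
results = {
    "harness|hendrycksTest-virology|5": {"acc": 0.5542168674698795},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8362573099415205},
}

# Select the MMLU subtasks by their "harness|hendrycksTest-..." key prefix.
mmlu_tasks = [k for k in results if k.startswith("harness|hendrycksTest-")]

# Unweighted (macro) average of per-task accuracy.
mmlu_acc = sum(results[k]["acc"] for k in mmlu_tasks) / len(mmlu_tasks)
print(f"macro-average acc over {len(mmlu_tasks)} tasks: {mmlu_acc:.4f}")
```

The same pattern extends to `acc_norm`, `mc2`, or any other per-task metric by changing the key looked up in each entry.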
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |