datasetId | card |
|---|---|
irds/lotte_technology_test | ---
pretty_name: '`lotte/technology/test`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `lotte/technology/test`
The `lotte/technology/test` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/lotte#lotte/technology/test).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=638,509
This dataset is used by: [`lotte_technology_test_forum`](https://huggingface.co/datasets/irds/lotte_technology_test_forum), [`lotte_technology_test_search`](https://huggingface.co/datasets/irds/lotte_technology_test_search)
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/lotte_technology_test', 'docs')
for record in docs:
    record # {'doc_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
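If you need random access by `doc_id`, a small lookup can be built on top of the loaded corpus (an illustrative sketch following the usage pattern above, not part of the official `ir-datasets` API):
```python
from datasets import load_dataset

# Illustrative sketch: build an in-memory doc_id -> text lookup, following
# the iteration pattern shown above. With 638,509 documents this may take
# a few GB of RAM.
docs = load_dataset('irds/lotte_technology_test', 'docs')
doc_lookup = {record['doc_id']: record['text'] for record in docs}
```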
## Citation Information
```
@article{Santhanam2021ColBERTv2,
title = "ColBERTv2: Effective and Efficient Retrieval via Lightweight Late Interaction",
author = "Keshav Santhanam and Omar Khattab and Jon Saad-Falcon and Christopher Potts and Matei Zaharia",
journal= "arXiv preprint arXiv:2112.01488",
year = "2021",
url = "https://arxiv.org/abs/2112.01488"
}
```
|
CVasNLPExperiments/Food101_10samples_class_test_google_flan_t5_xl_mode_T_SPECIFIC_A_ns_1010 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0__Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_clip_tags_LAION_ViT_H_14_2B_simple_specific_rices
num_bytes: 449184
num_examples: 1010
download_size: 80441
dataset_size: 449184
---
# Dataset Card for "Food101_10samples_class_test_google_flan_t5_xl_mode_T_SPECIFIC_A_ns_1010"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liblinear/small-eng-russian-paintings-t2i | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 993887.0
num_examples: 6
download_size: 995744
dataset_size: 993887.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-78000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1001745
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bigscience-data/roots_en_the_pile_uspto | ---
language: en
license: mit
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_en_the_pile_uspto
# the_pile_uspto
- Dataset uid: `the_pile_uspto`
### Description
### Homepage
### Licensing
### Speaker Locations
### Sizes
- 0.5358 % of total
- 2.9032 % of en
### BigScience processing steps
#### Filters applied to: en
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024 (see the sketch below)
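As an illustration of the last filter, a sketch inferred from the filter's name (not the original BigScience implementation):
```python
# Sketch inferred from the filter name, not the original BigScience code:
# keep only documents of at least 1024 bytes (UTF-8).
def filter_small_docs_bytes_1024(text: str) -> bool:
    return len(text.encode("utf-8")) >= 1024

docs = ["short doc", "x" * 2048]
kept = [d for d in docs if filter_small_docs_bytes_1024(d)]  # keeps the long one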
|
FaalSa/data9 | ---
dataset_info:
features:
- name: start
dtype: timestamp[s]
- name: target
sequence: float32
- name: item_id
dtype: string
- name: feat_static_cat
sequence: uint64
splits:
- name: train
num_bytes: 17309
num_examples: 1
- name: validation
num_bytes: 17789
num_examples: 1
- name: test
num_bytes: 18269
num_examples: 1
download_size: 16390
dataset_size: 53367
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
SEACrowd/indspeech_news_tts | ---
tags:
- text-to-speech
language:
- ind
---
# INDspeech_NEWS_TTS
INDspeech_NEWS_TTS is a speech dataset for developing an Indonesian text-to-speech synthesis system. The data was developed by the Advanced Telecommunication Research Institute International (ATR), Japan, under the Asian speech translation advanced research (A-STAR) project [Sakti et al., 2013].
## Dataset Usage
Run `pip install nusacrowd` before loading the dataset through HuggingFace's `load_dataset`.
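A minimal loading sketch, assuming the dataset exposes a default configuration through its loading script (`trust_remote_code=True` is an assumption here):
```python
from datasets import load_dataset

# Minimal sketch; run `pip install nusacrowd` first, as noted above.
# trust_remote_code=True is assumed because the dataset ships a loading script.
ds = load_dataset("SEACrowd/indspeech_news_tts", trust_remote_code=True)
print(ds)
```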
## Citation
```
@inproceedings{sakti-tts-cocosda-2008,
title = "Development of HMM-based Indonesian Speech Synthesis",
author = "Sakti, Sakriani and Maia, Ranniery and Sakai, Shinsuke and Nakamura, Satoshi",
booktitle = "Proc. Oriental COCOSDA",
year = "2008",
pages = "215--220"
address = "Kyoto, Japan"
}
@inproceedings{sakti-tts-malindo-2010,
title = "Quality and Intelligibility Assessment of Indonesian HMM-Based Speech Synthesis System",
author = "Sakti, Sakriani and Sakai, Shinsuke and Isotani, Ryosuke and Kawai, Hisashi and Nakamura, Satoshi",
booktitle = "Proc. MALINDO",
year = "2010",
pages = "51--57"
address = "Jakarta, Indonesia"
}
@article{sakti-s2st-csl-2013,
title = "{A-STAR}: Toward Tranlating Asian Spoken Languages",
author = "Sakti, Sakriani and Paul, Michael and Finch, Andrew and Sakai, Shinsuke and Thang, Tat Vu, and Kimura, Noriyuki
and Hori, Chiori and Sumita, Eiichiro and Nakamura, Satoshi and Park, Jun and Wutiwiwatchai, Chai and Xu, Bo and Riza, Hammam
and Arora, Karunesh and Luong, Chi Mai and Li, Haizhou",
journal = "Special issue on Speech-to-Speech Translation, Computer Speech and Language Journal",
volume = "27",
number ="2",
pages = "509--527",
year = "2013",
publisher = "Elsevier"
}
```
## License
CC-BY-NC-SA 4.0
## Homepage
[https://github.com/s-sakti/data_indsp_news_tts](https://github.com/s-sakti/data_indsp_news_tts)
### NusaCatalogue
For easy indexing and metadata: [https://indonlp.github.io/nusa-catalogue](https://indonlp.github.io/nusa-catalogue) |
LeoTungAnh/kdd210_hourly_96 | ---
dataset_info:
features:
- name: start
dtype: timestamp[s]
- name: feat_static_cat
sequence: uint64
- name: feat_dynamic_real
sequence:
sequence: float32
- name: item_id
dtype: string
- name: target
sequence: float64
splits:
- name: train
num_bytes: 17993559
num_examples: 210
- name: validation
num_bytes: 18154839
num_examples: 210
- name: test
num_bytes: 18316119
num_examples: 210
download_size: 47500480
dataset_size: 54464517
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
# Dataset Card for "kdd210_hourly_96"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmayhem93/random-walk-reddit-corpus-55-cleaned | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3937195661
num_examples: 6141002
download_size: 2309272818
dataset_size: 3937195661
---
# Dataset Card for "random-walk-reddit-corpus-55-cleaned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HNUSYNROVO/data | ---
license: afl-3.0
---
|
greathero/evenmorex6-smaller-newercontrailsvalidationdataset | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 265780521.0
num_examples: 9000
download_size: 257984734
dataset_size: 265780521.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nataliaElv/test_spans_dataset | ---
language:
- en
size_categories: n<1K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for test_spans_dataset
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla as `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("nataliaElv/test_spans_dataset")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` as `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("nataliaElv/test_spans_dataset")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/conceptual_guides/data_model.html#feedback-dataset) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, **metadata**, **vectors**, and **guidelines**.
The **fields** are the dataset records themselves; at the moment only text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| prompt | Prompt-(Ents) | text | True | False |
| input | Input-(Ents) | text | True | False |
| input2 | Input-(Info Extraction) | text | True | False |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label_selection, multi_label_selection, or ranking.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| prompt-ents | Highlight the entities inside Prompt-(Ents): | span | True | N/A | N/A |
| input-ents | Highlight the entities inside Input-(Ents): | span | True | N/A | N/A |
| info-extraction | Highlight the information inside Input-(Info Extraction) that is relevant to the prompt | span | True | N/A | N/A |
| final-response | Provide a correct response given the prompt and the input: | text | True | Only make the necessary corrections. You can submit the text as it is, if it's correct. | N/A |
The **suggestions** are human- or machine-generated recommendations for each question, meant to assist the annotator during the annotation process. They are always linked to an existing question and are named by appending "-suggestion" and "-suggestion-metadata" to the question name, containing the value(s) of the suggestion and its metadata, respectively. The possible values are therefore the same as in the table above, with the column names suffixed accordingly.
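For instance, the suggestion columns can be inspected directly from the `datasets` view (a sketch; the column names follow the convention just described):
```python
from datasets import load_dataset

# Sketch of the naming convention above: every question has matching
# "-suggestion" and "-suggestion-metadata" columns in the datasets view.
ds = load_dataset("nataliaElv/test_spans_dataset", split="train")
record = ds[0]
print(record["prompt-ents-suggestion"])           # suggested span value(s)
print(record["prompt-ents-suggestion-metadata"])  # agent / score / type
```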
The **metadata** is a dictionary that can be used to attach additional information to a dataset record, for example to give annotators extra context or to record provenance details such as a link to the original source, the author, or the date. The metadata is always optional, and can be potentially linked to the `metadata_properties` defined in the dataset configuration file `argilla.yaml`.
| Metadata Name | Title | Type | Values | Visible for Annotators |
| ------------- | ----- | ---- | ------ | ---------------------- |
The **guidelines** are optional as well, and are just a plain string that can be used to provide instructions to the annotators. Find them in the [annotation guidelines](#annotation-guidelines) section.
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"external_id": null,
"fields": {
"input": "Virgin Australia, the trading name of Virgin Australia Airlines Pty Ltd, is an Australian-based airline. It is the largest airline by fleet size to use the Virgin brand. It commenced services on 31 August 2000 as Virgin Blue, with two aircraft on a single route. It suddenly found itself as a major airline in Australia\u0027s domestic market after the collapse of Ansett Australia in September 2001. The airline has since grown to directly serve 32 cities in Australia, from hubs in Brisbane, Melbourne and Sydney.",
"input2": "Virgin Australia, the trading name of Virgin Australia Airlines Pty Ltd, is an Australian-based airline. It is the largest airline by fleet size to use the Virgin brand. It commenced services on 31 August 2000 as Virgin Blue, with two aircraft on a single route. It suddenly found itself as a major airline in Australia\u0027s domestic market after the collapse of Ansett Australia in September 2001. The airline has since grown to directly serve 32 cities in Australia, from hubs in Brisbane, Melbourne and Sydney.",
"prompt": "When did Virgin Australia start operating?"
},
"metadata": {},
"responses": [],
"suggestions": [
{
"agent": null,
"question_name": "prompt-ents",
"score": null,
"type": null,
"value": [
{
"end": 25,
"label": "ORG",
"score": 0.9999854564666748,
"start": 9
}
]
},
{
"agent": null,
"question_name": "input-ents",
"score": null,
"type": null,
"value": [
{
"end": 16,
"label": "ORG",
"score": 0.9998990297317505,
"start": 0
},
{
"end": 71,
"label": "ORG",
"score": 0.9999301433563232,
"start": 38
},
{
"end": 162,
"label": "ORG",
"score": 0.9961417317390442,
"start": 156
},
{
"end": 224,
"label": "ORG",
"score": 0.9999250173568726,
"start": 213
},
{
"end": 319,
"label": "LOC",
"score": 0.9998377561569214,
"start": 310
},
{
"end": 376,
"label": "ORG",
"score": 0.9999576807022095,
"start": 360
},
{
"end": 464,
"label": "LOC",
"score": 0.9998786449432373,
"start": 455
},
{
"end": 487,
"label": "LOC",
"score": 0.9998598098754883,
"start": 479
},
{
"end": 498,
"label": "LOC",
"score": 0.9997498393058777,
"start": 489
},
{
"end": 509,
"label": "LOC",
"score": 0.9998868703842163,
"start": 503
}
]
},
{
"agent": null,
"question_name": "final-response",
"score": null,
"type": null,
"value": "Virgin Australia commenced services on 31 August 2000 as Virgin Blue, with two aircraft on a single route."
}
],
"vectors": {}
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"external_id": null,
"final-response": [],
"final-response-suggestion": "Virgin Australia commenced services on 31 August 2000 as Virgin Blue, with two aircraft on a single route.",
"final-response-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"info-extraction": [],
"info-extraction-suggestion": null,
"info-extraction-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"input": "Virgin Australia, the trading name of Virgin Australia Airlines Pty Ltd, is an Australian-based airline. It is the largest airline by fleet size to use the Virgin brand. It commenced services on 31 August 2000 as Virgin Blue, with two aircraft on a single route. It suddenly found itself as a major airline in Australia\u0027s domestic market after the collapse of Ansett Australia in September 2001. The airline has since grown to directly serve 32 cities in Australia, from hubs in Brisbane, Melbourne and Sydney.",
"input-ents": [],
"input-ents-suggestion": {
"end": [
16,
71,
162,
224,
319,
376,
464,
487,
498,
509
],
"label": [
"ORG",
"ORG",
"ORG",
"ORG",
"LOC",
"ORG",
"LOC",
"LOC",
"LOC",
"LOC"
],
"score": [
0.9998990297317505,
0.9999301433563232,
0.9961417317390442,
0.9999250173568726,
0.9998377561569214,
0.9999576807022095,
0.9998786449432373,
0.9998598098754883,
0.9997498393058777,
0.9998868703842163
],
"start": [
0,
38,
156,
213,
310,
360,
455,
479,
489,
503
],
"text": [
"Virgin Australia",
"Virgin Australia Airlines Pty Ltd",
"Virgin",
"Virgin Blue",
"Australia",
"Ansett Australia",
"Australia",
"Brisbane",
"Melbourne",
"Sydney"
]
},
"input-ents-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"input2": "Virgin Australia, the trading name of Virgin Australia Airlines Pty Ltd, is an Australian-based airline. It is the largest airline by fleet size to use the Virgin brand. It commenced services on 31 August 2000 as Virgin Blue, with two aircraft on a single route. It suddenly found itself as a major airline in Australia\u0027s domestic market after the collapse of Ansett Australia in September 2001. The airline has since grown to directly serve 32 cities in Australia, from hubs in Brisbane, Melbourne and Sydney.",
"metadata": "{}",
"prompt": "When did Virgin Australia start operating?",
"prompt-ents": [],
"prompt-ents-suggestion": {
"end": [
25
],
"label": [
"ORG"
],
"score": [
0.9999854564666748
],
"start": [
9
],
"text": [
"Virgin Australia"
]
},
"prompt-ents-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
}
}
```
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves; at the moment only text fields are supported. These are the ones that will be used to provide responses to the questions.
* **prompt** is of type `text`.
* **input** is of type `text`.
* **input2** is of type `text`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **prompt-ents** is of type `span`.
* **input-ents** is of type `span`.
* **info-extraction** is of type `span`.
  * **final-response** is of type `text`, with the description "Only make the necessary corrections. You can submit the text as it is, if it's correct."
* **Suggestions:** As of Argilla 1.13.0, suggestions are included to assist the annotators during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
* (optional) **prompt-ents-suggestion** is of type `span`.
* (optional) **input-ents-suggestion** is of type `span`.
* (optional) **info-extraction-suggestion** is of type `span`.
* (optional) **final-response-suggestion** is of type `text`.
Additionally, there are two more optional fields:
* **metadata:** This is an optional field that can be used to attach additional information to the dataset record, for example to give annotators extra context or to record provenance details such as a link to the original source, the author, or the date. The metadata is always optional, and can be potentially linked to the `metadata_properties` defined in the dataset configuration file `argilla.yaml`.
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
This is a subset of the Dolly dataset with prompts classified as Closed QA or Information Extraction tasks.
In the record, you will find the prompt and the input of the task. In the first two fields, you will need to highlight and classify all entities found in the prompt and the input. These are marked as (Ents) for easier recognition.
The input field is then repeated as "Input-(Info Extraction)". Using the "Relevant Info" tag, highlight all pieces of information in the input that are relevant to answer the prompt.
Finally, you will be asked to provide a correct response following the prompt and the given input. You may submit the text as it is, if it's correct, or make any necessary amendments.
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
orzhan/minecraft-captioning | ---
license: mit
task_categories:
- image-to-text
language:
- en
size_categories:
- n<1K
--- |
zydxn77/zydxn | ---
license: mit
---
|
CyberHarem/miyauchi_hikage_nonnonbiyori | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Miyauchi Hikage
This is the dataset of Miyauchi Hikage, containing 192 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 192 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 446 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 497 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 192 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 192 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 192 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 446 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 446 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 393 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 497 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 497 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
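A hedged sketch for fetching one of the packs listed above with `huggingface_hub` (the filename is taken from the table's download links):
```python
from huggingface_hub import hf_hub_download

# Sketch: download the raw pack listed in the table above from this dataset repo.
path = hf_hub_download(
    repo_id="CyberHarem/miyauchi_hikage_nonnonbiyori",
    filename="dataset-raw.zip",
    repo_type="dataset",
)
print(path)
```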
|
open-llm-leaderboard/details_ed001__datascience-coder-6.7b | ---
pretty_name: Evaluation run of ed001/datascience-coder-6.7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ed001/datascience-coder-6.7b](https://huggingface.co/ed001/datascience-coder-6.7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ed001__datascience-coder-6.7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-05T09:33:35.006022](https://huggingface.co/datasets/open-llm-leaderboard/details_ed001__datascience-coder-6.7b/blob/main/results_2024-01-05T09-33-35.006022.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.38026416351025355,\n\
\ \"acc_stderr\": 0.03435235823130946,\n \"acc_norm\": 0.38170571467795217,\n\
\ \"acc_norm_stderr\": 0.03507989456880972,\n \"mc1\": 0.2741738066095471,\n\
\ \"mc1_stderr\": 0.015616518497219373,\n \"mc2\": 0.44821795300523526,\n\
\ \"mc2_stderr\": 0.01501348980684818\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3430034129692833,\n \"acc_stderr\": 0.01387242322371817,\n\
\ \"acc_norm\": 0.3464163822525597,\n \"acc_norm_stderr\": 0.013905011180063242\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.41057558255327625,\n\
\ \"acc_stderr\": 0.004909328992915071,\n \"acc_norm\": 0.538338976299542,\n\
\ \"acc_norm_stderr\": 0.00497509105569719\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37777777777777777,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.37777777777777777,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3092105263157895,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.3092105263157895,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.32,\n\
\ \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4226415094339623,\n \"acc_stderr\": 0.030402331445769537,\n\
\ \"acc_norm\": 0.4226415094339623,\n \"acc_norm_stderr\": 0.030402331445769537\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3472222222222222,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.3472222222222222,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237101,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237101\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3352601156069364,\n\
\ \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.3352601156069364,\n\
\ \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179328,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179328\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.33191489361702126,\n \"acc_stderr\": 0.030783736757745647,\n\
\ \"acc_norm\": 0.33191489361702126,\n \"acc_norm_stderr\": 0.030783736757745647\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.041857744240220554,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.041857744240220554\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3439153439153439,\n \"acc_stderr\": 0.024464426625596433,\n \"\
acc_norm\": 0.3439153439153439,\n \"acc_norm_stderr\": 0.024464426625596433\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017087,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017087\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.38064516129032255,\n \"acc_stderr\": 0.02762171783290703,\n \"\
acc_norm\": 0.38064516129032255,\n \"acc_norm_stderr\": 0.02762171783290703\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.33004926108374383,\n \"acc_stderr\": 0.033085304262282574,\n \"\
acc_norm\": 0.33004926108374383,\n \"acc_norm_stderr\": 0.033085304262282574\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3939393939393939,\n \"acc_stderr\": 0.038154943086889305,\n\
\ \"acc_norm\": 0.3939393939393939,\n \"acc_norm_stderr\": 0.038154943086889305\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.40404040404040403,\n \"acc_stderr\": 0.034961309720561266,\n \"\
acc_norm\": 0.40404040404040403,\n \"acc_norm_stderr\": 0.034961309720561266\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.41450777202072536,\n \"acc_stderr\": 0.03555300319557672,\n\
\ \"acc_norm\": 0.41450777202072536,\n \"acc_norm_stderr\": 0.03555300319557672\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3717948717948718,\n \"acc_stderr\": 0.024503472557110936,\n\
\ \"acc_norm\": 0.3717948717948718,\n \"acc_norm_stderr\": 0.024503472557110936\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.02773896963217609,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.02773896963217609\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.40756302521008403,\n \"acc_stderr\": 0.03191863374478466,\n\
\ \"acc_norm\": 0.40756302521008403,\n \"acc_norm_stderr\": 0.03191863374478466\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.03734535676787198,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.03734535676787198\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3779816513761468,\n \"acc_stderr\": 0.020789187066728113,\n \"\
acc_norm\": 0.3779816513761468,\n \"acc_norm_stderr\": 0.020789187066728113\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3055555555555556,\n \"acc_stderr\": 0.031415546294025425,\n \"\
acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.031415546294025425\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.39705882352941174,\n \"acc_stderr\": 0.03434131164719129,\n \"\
acc_norm\": 0.39705882352941174,\n \"acc_norm_stderr\": 0.03434131164719129\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4050632911392405,\n \"acc_stderr\": 0.03195514741370674,\n \
\ \"acc_norm\": 0.4050632911392405,\n \"acc_norm_stderr\": 0.03195514741370674\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34080717488789236,\n\
\ \"acc_stderr\": 0.03181149747055359,\n \"acc_norm\": 0.34080717488789236,\n\
\ \"acc_norm_stderr\": 0.03181149747055359\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.366412213740458,\n \"acc_stderr\": 0.04225875451969638,\n\
\ \"acc_norm\": 0.366412213740458,\n \"acc_norm_stderr\": 0.04225875451969638\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.45454545454545453,\n \"acc_stderr\": 0.045454545454545456,\n \"\
acc_norm\": 0.45454545454545453,\n \"acc_norm_stderr\": 0.045454545454545456\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3611111111111111,\n\
\ \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.3611111111111111,\n\
\ \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4233128834355828,\n \"acc_stderr\": 0.03881891213334382,\n\
\ \"acc_norm\": 0.4233128834355828,\n \"acc_norm_stderr\": 0.03881891213334382\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.39805825242718446,\n \"acc_stderr\": 0.04846748253977239,\n\
\ \"acc_norm\": 0.39805825242718446,\n \"acc_norm_stderr\": 0.04846748253977239\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6282051282051282,\n\
\ \"acc_stderr\": 0.03166098891888078,\n \"acc_norm\": 0.6282051282051282,\n\
\ \"acc_norm_stderr\": 0.03166098891888078\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.438058748403576,\n\
\ \"acc_stderr\": 0.017742232238257247,\n \"acc_norm\": 0.438058748403576,\n\
\ \"acc_norm_stderr\": 0.017742232238257247\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3872832369942196,\n \"acc_stderr\": 0.026226158605124655,\n\
\ \"acc_norm\": 0.3872832369942196,\n \"acc_norm_stderr\": 0.026226158605124655\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25251396648044694,\n\
\ \"acc_stderr\": 0.014530330201468648,\n \"acc_norm\": 0.25251396648044694,\n\
\ \"acc_norm_stderr\": 0.014530330201468648\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.028074158947600666,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.028074158947600666\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4340836012861736,\n\
\ \"acc_stderr\": 0.02815023224453559,\n \"acc_norm\": 0.4340836012861736,\n\
\ \"acc_norm_stderr\": 0.02815023224453559\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3055555555555556,\n \"acc_stderr\": 0.025630824975621348,\n\
\ \"acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.025630824975621348\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3262411347517731,\n \"acc_stderr\": 0.027968453043563168,\n \
\ \"acc_norm\": 0.3262411347517731,\n \"acc_norm_stderr\": 0.027968453043563168\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27509778357235987,\n\
\ \"acc_stderr\": 0.01140544362099692,\n \"acc_norm\": 0.27509778357235987,\n\
\ \"acc_norm_stderr\": 0.01140544362099692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.375,\n \"acc_stderr\": 0.029408372932278746,\n \
\ \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.029408372932278746\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3088235294117647,\n \"acc_stderr\": 0.018690850273595287,\n \
\ \"acc_norm\": 0.3088235294117647,\n \"acc_norm_stderr\": 0.018690850273595287\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.509090909090909,\n\
\ \"acc_stderr\": 0.0478833976870286,\n \"acc_norm\": 0.509090909090909,\n\
\ \"acc_norm_stderr\": 0.0478833976870286\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4530612244897959,\n \"acc_stderr\": 0.03186785930004128,\n\
\ \"acc_norm\": 0.4530612244897959,\n \"acc_norm_stderr\": 0.03186785930004128\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.43781094527363185,\n\
\ \"acc_stderr\": 0.0350808011219984,\n \"acc_norm\": 0.43781094527363185,\n\
\ \"acc_norm_stderr\": 0.0350808011219984\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.37349397590361444,\n\
\ \"acc_stderr\": 0.037658451171688624,\n \"acc_norm\": 0.37349397590361444,\n\
\ \"acc_norm_stderr\": 0.037658451171688624\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3567251461988304,\n \"acc_stderr\": 0.03674013002860954,\n\
\ \"acc_norm\": 0.3567251461988304,\n \"acc_norm_stderr\": 0.03674013002860954\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2741738066095471,\n\
\ \"mc1_stderr\": 0.015616518497219373,\n \"mc2\": 0.44821795300523526,\n\
\ \"mc2_stderr\": 0.01501348980684818\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5572217837411207,\n \"acc_stderr\": 0.013960157350784987\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2494313874147081,\n \
\ \"acc_stderr\": 0.011918265218445523\n }\n}\n```"
repo_url: https://huggingface.co/ed001/datascience-coder-6.7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|arc:challenge|25_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|gsm8k|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hellaswag|10_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T09-33-35.006022.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T09-33-35.006022.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- '**/details_harness|winogrande|5_2024-01-05T09-33-35.006022.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-05T09-33-35.006022.parquet'
- config_name: results
data_files:
- split: 2024_01_05T09_33_35.006022
path:
- results_2024-01-05T09-33-35.006022.parquet
- split: latest
path:
- results_2024-01-05T09-33-35.006022.parquet
---
# Dataset Card for Evaluation run of ed001/datascience-coder-6.7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ed001/datascience-coder-6.7b](https://huggingface.co/ed001/datascience-coder-6.7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ed001__datascience-coder-6.7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-05T09:33:35.006022](https://huggingface.co/datasets/open-llm-leaderboard/details_ed001__datascience-coder-6.7b/blob/main/results_2024-01-05T09-33-35.006022.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.38026416351025355,
"acc_stderr": 0.03435235823130946,
"acc_norm": 0.38170571467795217,
"acc_norm_stderr": 0.03507989456880972,
"mc1": 0.2741738066095471,
"mc1_stderr": 0.015616518497219373,
"mc2": 0.44821795300523526,
"mc2_stderr": 0.01501348980684818
},
"harness|arc:challenge|25": {
"acc": 0.3430034129692833,
"acc_stderr": 0.01387242322371817,
"acc_norm": 0.3464163822525597,
"acc_norm_stderr": 0.013905011180063242
},
"harness|hellaswag|10": {
"acc": 0.41057558255327625,
"acc_stderr": 0.004909328992915071,
"acc_norm": 0.538338976299542,
"acc_norm_stderr": 0.00497509105569719
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3092105263157895,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.3092105263157895,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4226415094339623,
"acc_stderr": 0.030402331445769537,
"acc_norm": 0.4226415094339623,
"acc_norm_stderr": 0.030402331445769537
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3352601156069364,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.3352601156069364,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179328,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179328
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.33191489361702126,
"acc_stderr": 0.030783736757745647,
"acc_norm": 0.33191489361702126,
"acc_norm_stderr": 0.030783736757745647
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.041857744240220554,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.041857744240220554
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3439153439153439,
"acc_stderr": 0.024464426625596433,
"acc_norm": 0.3439153439153439,
"acc_norm_stderr": 0.024464426625596433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017087,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017087
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.38064516129032255,
"acc_stderr": 0.02762171783290703,
"acc_norm": 0.38064516129032255,
"acc_norm_stderr": 0.02762171783290703
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33004926108374383,
"acc_stderr": 0.033085304262282574,
"acc_norm": 0.33004926108374383,
"acc_norm_stderr": 0.033085304262282574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3939393939393939,
"acc_stderr": 0.038154943086889305,
"acc_norm": 0.3939393939393939,
"acc_norm_stderr": 0.038154943086889305
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.40404040404040403,
"acc_stderr": 0.034961309720561266,
"acc_norm": 0.40404040404040403,
"acc_norm_stderr": 0.034961309720561266
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.41450777202072536,
"acc_stderr": 0.03555300319557672,
"acc_norm": 0.41450777202072536,
"acc_norm_stderr": 0.03555300319557672
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3717948717948718,
"acc_stderr": 0.024503472557110936,
"acc_norm": 0.3717948717948718,
"acc_norm_stderr": 0.024503472557110936
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.02773896963217609,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.02773896963217609
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.40756302521008403,
"acc_stderr": 0.03191863374478466,
"acc_norm": 0.40756302521008403,
"acc_norm_stderr": 0.03191863374478466
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.03734535676787198,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.03734535676787198
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3779816513761468,
"acc_stderr": 0.020789187066728113,
"acc_norm": 0.3779816513761468,
"acc_norm_stderr": 0.020789187066728113
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.031415546294025425,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.031415546294025425
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.39705882352941174,
"acc_stderr": 0.03434131164719129,
"acc_norm": 0.39705882352941174,
"acc_norm_stderr": 0.03434131164719129
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4050632911392405,
"acc_stderr": 0.03195514741370674,
"acc_norm": 0.4050632911392405,
"acc_norm_stderr": 0.03195514741370674
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.34080717488789236,
"acc_stderr": 0.03181149747055359,
"acc_norm": 0.34080717488789236,
"acc_norm_stderr": 0.03181149747055359
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.366412213740458,
"acc_stderr": 0.04225875451969638,
"acc_norm": 0.366412213740458,
"acc_norm_stderr": 0.04225875451969638
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.45454545454545453,
"acc_stderr": 0.045454545454545456,
"acc_norm": 0.45454545454545453,
"acc_norm_stderr": 0.045454545454545456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4233128834355828,
"acc_stderr": 0.03881891213334382,
"acc_norm": 0.4233128834355828,
"acc_norm_stderr": 0.03881891213334382
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285714,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285714
},
"harness|hendrycksTest-management|5": {
"acc": 0.39805825242718446,
"acc_stderr": 0.04846748253977239,
"acc_norm": 0.39805825242718446,
"acc_norm_stderr": 0.04846748253977239
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6282051282051282,
"acc_stderr": 0.03166098891888078,
"acc_norm": 0.6282051282051282,
"acc_norm_stderr": 0.03166098891888078
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.438058748403576,
"acc_stderr": 0.017742232238257247,
"acc_norm": 0.438058748403576,
"acc_norm_stderr": 0.017742232238257247
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3872832369942196,
"acc_stderr": 0.026226158605124655,
"acc_norm": 0.3872832369942196,
"acc_norm_stderr": 0.026226158605124655
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25251396648044694,
"acc_stderr": 0.014530330201468648,
"acc_norm": 0.25251396648044694,
"acc_norm_stderr": 0.014530330201468648
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.028074158947600666,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.028074158947600666
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4340836012861736,
"acc_stderr": 0.02815023224453559,
"acc_norm": 0.4340836012861736,
"acc_norm_stderr": 0.02815023224453559
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.025630824975621348,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.025630824975621348
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3262411347517731,
"acc_stderr": 0.027968453043563168,
"acc_norm": 0.3262411347517731,
"acc_norm_stderr": 0.027968453043563168
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.27509778357235987,
"acc_stderr": 0.01140544362099692,
"acc_norm": 0.27509778357235987,
"acc_norm_stderr": 0.01140544362099692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.375,
"acc_stderr": 0.029408372932278746,
"acc_norm": 0.375,
"acc_norm_stderr": 0.029408372932278746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3088235294117647,
"acc_stderr": 0.018690850273595287,
"acc_norm": 0.3088235294117647,
"acc_norm_stderr": 0.018690850273595287
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.509090909090909,
"acc_stderr": 0.0478833976870286,
"acc_norm": 0.509090909090909,
"acc_norm_stderr": 0.0478833976870286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4530612244897959,
"acc_stderr": 0.03186785930004128,
"acc_norm": 0.4530612244897959,
"acc_norm_stderr": 0.03186785930004128
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.43781094527363185,
"acc_stderr": 0.0350808011219984,
"acc_norm": 0.43781094527363185,
"acc_norm_stderr": 0.0350808011219984
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-virology|5": {
"acc": 0.37349397590361444,
"acc_stderr": 0.037658451171688624,
"acc_norm": 0.37349397590361444,
"acc_norm_stderr": 0.037658451171688624
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3567251461988304,
"acc_stderr": 0.03674013002860954,
"acc_norm": 0.3567251461988304,
"acc_norm_stderr": 0.03674013002860954
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2741738066095471,
"mc1_stderr": 0.015616518497219373,
"mc2": 0.44821795300523526,
"mc2_stderr": 0.01501348980684818
},
"harness|winogrande|5": {
"acc": 0.5572217837411207,
"acc_stderr": 0.013960157350784987
},
"harness|gsm8k|5": {
"acc": 0.2494313874147081,
"acc_stderr": 0.011918265218445523
}
}
```
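If you only need the aggregated numbers, the `results` configuration declared in the YAML header above can be loaded directly. A minimal sketch (the `results` config name and its `latest` split are taken from that header):
```python
from datasets import load_dataset

# Load only the aggregated metrics; the "results" configuration and its
# "latest" split are declared in the YAML header of this card.
results = load_dataset(
    "open-llm-leaderboard/details_ed001__datascience-coder-6.7b",
    "results",
    split="latest",
)
print(results[0])
```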
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
tasksource/lsat-rc | ---
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: context
dtype: string
- name: id_string
dtype: string
- name: answers
sequence: string
- name: label
dtype: int64
- name: question
dtype: string
splits:
- name: validation
num_bytes: 982698
num_examples: 270
- name: train
num_bytes: 6676505
num_examples: 1827
- name: test
num_bytes: 978997
num_examples: 269
download_size: 1474121
dataset_size: 8638200
---
# Dataset Card for "lsat-rc"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
theoracle/commodore64_program_latest | ---
license: apache-2.0
---
|
jstack32/SampleDataset | ---
dataset_info:
features:
- name: path
dtype: string
- name: audio
dtype: int64
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 102
num_examples: 2
download_size: 1893
dataset_size: 102
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- en
---
# Dataset Card for "SampleDataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Multimodal-Fatima/Hatefulmemes_test | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': not-hateful
'1': hateful
- name: id
dtype: int64
- name: clip_tags_ViT_L_14
sequence: string
- name: LLM_Description_gpt3_downstream_tasks_ViT_L_14
sequence: string
- name: blip_caption
dtype: string
- name: clip_tags_LAION_ViT_H_14_2B
sequence: string
- name: blip_caption_beam_5
dtype: string
- name: LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14
sequence: string
- name: LLM_Description_gpt3_downstream_tasks_visual_genome_LAION-ViT-H-14-2B
sequence: string
- name: DETA_detections_deta_swin_large_o365_coco_classes
list:
- name: attribute
dtype: string
- name: box
sequence: float32
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float32
- name: size
dtype: string
- name: tag
dtype: string
- name: DETA_detections_deta_swin_large_o365_coco_classes_caption_module_random
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: captions_module
sequence: string
- name: captions_module_filter
sequence: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
- name: Attributes_ViT_L_14_descriptors_text_davinci_003_full
sequence: string
- name: Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full
sequence: string
- name: clip_tags_ViT_L_14_with_openai
sequence: string
- name: clip_tags_LAION_ViT_H_14_2B_with_openai
sequence: string
- name: blip_caption_topk_50_Salesforce_blip_image_captioning_large_multiple
sequence: string
- name: DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large_
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: captions_all_patches
sequence: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
- name: blip_caption_beam_5_Salesforce_blip_image_captioning_large
dtype: string
- name: clip_tags_LAION_ViT_H_14_2B_with_openai_wordnet
sequence: string
- name: blip_caption_5_Salesforce_blip_image_captioning_large_hf
dtype: string
- name: blip_caption_5_Salesforce_blip_image_captioning_large_hf_a meme of
dtype: string
- name: blip_caption_5_Salesforce_blip_image_captioning_large_max_length_30_hf
dtype: string
- name: blip_caption_5_Salesforce_blip_image_captioning_large_max_length_200_hf
dtype: string
- name: blip_caption_5_Salesforce_blip_image_captioning_large_max_length_200_hf_a
meme of
dtype: string
- name: blip_caption_False_beams_5_Salesforce_blip_image_captioning_large_max_length_30_hf
dtype: string
- name: blip_caption_beam_False_5_source
dtype: string
- name: 'blip_caption_False_beams_5_base_capfilt_large_max_length_30_source_a pitcure
of '
dtype: string
- name: 'blip_caption_False_beams_5_base_capfilt_large_max_length_100_source_a pitcure
of '
dtype: string
- name: clip_tags_LAION_ViT_H_14_2B_simple_specific
dtype: string
- name: clip_tags_ViT_L_14_simple_specific
dtype: string
- name: clip_tags_LAION_ViT_H_14_2B_laion.pt
sequence: string
splits:
- name: test
num_bytes: 421626763.0
num_examples: 1000
download_size: 387589337
dataset_size: 421626763.0
---
# Dataset Card for "Hatefulmemes_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nielzac/Graph2Text_rdf_type | ---
license: apache-2.0
language:
- en
tags:
- graph
--- |
L4IO/tota_gep_arpad | ---
license: gfdl
task_categories:
- text-generation
language:
- hu
pretty_name: tota_gep_arpad
size_categories:
- n<1K
--- |
sms1097/self_rag_tokens_train_data | ---
language:
- en
license: mit
pretty_name: f
dataset_info:
features:
- name: instruction
dtype: string
- name: retrieval
dtype: string
- name: doc
dtype: string
- name: relevant
dtype: string
- name: answer
dtype: string
- name: support
dtype: string
- name: utility
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 49969679
num_examples: 79132
download_size: 29115294
dataset_size: 49969679
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Self-Rag Tokens Dataset
This dataset is a spin-off of the work from [Self-Rag training data](https://huggingface.co/datasets/selfrag/selfrag_train_data).
In Self-RAG, the authors show how an LLM can be trained to predict reflection tokens: whether retrieval is needed, whether the retrieved context is relevant or irrelevant, whether the answer is supported, and how useful the response is.
The limitation of Self-RAG is that you must train the LLM on this task, which can be tricky or cost-prohibitive. Given how rapidly LLM performance develops, investing in training one LLM with Self-RAG may not be worthwhile when a better model becomes available soon after.
We propose a new task with this dataset: using the instruction, context, and generated answer, train separate classification models that predict these tokens.
This gives you a more flexible system where you control which LLM is used and when the reflection tokens are generated; a minimal sketch follows the token list below.
### Token Review
Here are the tokens you can use for training:
- Retrieve: (Decides whether a doc is needed to generate an answer to the instruction)
- [No Retrieval] 51015
- [Retrieval] 28117
- Relevant (doc provides useful information to solve x)
- [Relevant] 24251
- [Irrelevant] 3866
- Support (All of the verification-worthy statement in answer is supported by doc)
- [Fully supported] 19170
- [Partially supported] 3259
- [No support / Contradictory] 1822
- Utility (answer is a useful response to instruction)
- [Utility:5] 65774
- [Utility:4] 6387
- [Utility:2] 4300
- [Utility:1] 2601
- [Utility:3] 70
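As a minimal, hypothetical sketch of this framing (field names come from the dataset schema above; treating the `relevant` token as a binary target is an illustrative assumption):
```python
from datasets import load_dataset

# Minimal sketch: frame the "relevant" reflection token as a binary
# classification target. Field names come from this dataset's schema;
# the token strings are the ones listed above.
ds = load_dataset("sms1097/self_rag_tokens_train_data", split="train")

label2id = {"[Relevant]": 1, "[Irrelevant]": 0}

def to_classification_example(row):
    # Concatenate the instruction and the retrieved doc as classifier input.
    text = f"Instruction: {row['instruction']}\nDocument: {row['doc']}"
    return {"text": text, "label": label2id[row["relevant"]]}

clf_ds = ds.filter(lambda r: r["relevant"] in label2id).map(to_classification_example)
print(clf_ds[0]["label"], clf_ds[0]["text"][:120])
```
The same pattern applies to the other token groups (retrieval, support, utility) with their own label maps.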
|
Falah/blonde_woman_photography_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 98527
num_examples: 1000
download_size: 1673
dataset_size: 98527
---
# Dataset Card for "blonde_woman_photography_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
oyemade/test-yoruba-tts | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 10930138.0
num_examples: 8
download_size: 7772826
dataset_size: 10930138.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bigscience/xP3megds | ---
annotations_creators:
- expert-generated
- crowdsourced
language:
- ak
- ar
- as
- bm
- bn
- ca
- code
- en
- es
- eu
- fon
- fr
- gu
- hi
- id
- ig
- ki
- kn
- lg
- ln
- ml
- mr
- ne
- nso
- ny
- or
- pa
- pt
- rn
- rw
- sn
- st
- sw
- ta
- te
- tn
- ts
- tum
- tw
- ur
- vi
- wo
- xh
- yo
- zh
- zu
programming_language:
- C
- C++
- C#
- Go
- Java
- JavaScript
- Lua
- PHP
- Python
- Ruby
- Rust
- Scala
- TypeScript
license:
- apache-2.0
multilinguality:
- multilingual
pretty_name: xP3
size_categories:
- 100M<n<1B
task_categories:
- other
---
# Dataset Card for xP3
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** https://github.com/bigscience-workshop/xmtf
- **Paper:** [Crosslingual Generalization through Multitask Finetuning](https://arxiv.org/abs/2211.01786)
- **Point of Contact:** [Niklas Muennighoff](mailto:niklas@hf.co)
### Dataset Summary
> xP3 (Crosslingual Public Pool of Prompts) is a collection of prompts & datasets across 46 languages & 16 NLP tasks. It is used for the training of BLOOMZ and mT0, multilingual language models capable of following human instructions in dozens of languages zero-shot.
- **Creation:** The dataset can be recreated using instructions available [here](https://github.com/bigscience-workshop/xmtf#create-xp3). We provide this version to save processing time and ease reproducibility.
- **Languages:** 46 (Can be extended by [recreating with more splits](https://github.com/bigscience-workshop/xmtf#create-xp3))
- **xP3 Dataset Family:**
<table>
<tr>
<th>Name</th>
<th>Explanation</th>
<th>Example models</th>
</tr>
<tr>
 <td><a href=https://huggingface.co/datasets/Muennighoff/xP3x>xP3x</a></td>
<td>Mixture of 17 tasks in 277 languages with English prompts</td>
<td>WIP - Join us at Project Aya @<a href=https://cohere.for.ai/>C4AI</a> to help!</td>
</tr>
<tr>
 <td><a href=https://huggingface.co/datasets/bigscience/xP3>xP3</a></td>
<td>Mixture of 13 training tasks in 46 languages with English prompts</td>
<td><a href=https://huggingface.co/bigscience/bloomz>bloomz</a> & <a href=https://huggingface.co/bigscience/mt0-xxl>mt0-xxl</a></td>
</tr>
<tr>
 <td><a href=https://huggingface.co/datasets/bigscience/xP3mt>xP3mt</a></td>
<td>Mixture of 13 training tasks in 46 languages with prompts in 20 languages (machine-translated from English)</td>
<td><a href=https://huggingface.co/bigscience/bloomz-mt>bloomz-mt</a> & <a href=https://huggingface.co/bigscience/mt0-xxl-mt>mt0-xxl-mt</a></td>
</tr>
<tr>
 <td><a href=https://huggingface.co/datasets/bigscience/xP3all>xP3all</a></td>
<td>xP3 + evaluation datasets adding an additional 3 tasks for a total of 16 tasks in 46 languages with English prompts</td>
<td></td>
</tr>
<tr>
 <td><a href=https://huggingface.co/datasets/bigscience/xP3megds>xP3megds</a></td>
<td><a href=https://github.com/bigscience-workshop/Megatron-DeepSpeed>Megatron-DeepSpeed</a> processed version of xP3</td>
<td><a href=https://huggingface.co/bigscience/bloomz>bloomz</a></td>
</tr>
<tr>
 <td><a href=https://huggingface.co/datasets/Muennighoff/P3>P3</a></td>
<td>Repreprocessed version of the English-only <a href=https://huggingface.co/datasets/bigscience/P3>P3</a> with 8 training tasks</td>
<td><a href=https://huggingface.co/bigscience/bloomz-p3>bloomz-p3</a> & <a href=https://huggingface.co/bigscience/mt0-xxl-p3>mt0-xxl-p3</a></td>
</tr>
</table>
## Dataset Structure
### Data Instances
An example of "train" looks as follows:
```json
{
"inputs": "Sentence 1: Fue académico en literatura metafísica, teología y ciencias clásicas.\nSentence 2: Fue académico en literatura metafísica, teología y ciencia clásica.\nQuestion: Can we rewrite Sentence 1 to Sentence 2? Yes or No?",
"targets": "Yes"
}
```
### Data Fields
The data fields are the same among all splits:
- `inputs`: the natural language input fed to the model
- `targets`: the natural language target that the model has to generate
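As a minimal sketch of reading one per-language file (the `merged_{lang}.jsonl` naming follows the Data Splits section below; the exact repository path is an assumption and may need adjusting):
```python
from datasets import load_dataset

# Hedged sketch: stream one merged per-language jsonl file. The
# merged_{lang}.jsonl naming follows the Data Splits table; the exact
# path within the repository is an assumption.
url = "https://huggingface.co/datasets/bigscience/xP3megds/resolve/main/merged_en.jsonl"
ds = load_dataset("json", data_files=url, split="train", streaming=True)
example = next(iter(ds))
print(example["inputs"][:100], "->", example["targets"])
```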
### Data Splits
The table below summarizes sizes per language (computed from the `merged_{lang}.jsonl` files). Because languages like `tw` consist only of single-sentence translation samples from Flores, their byte percentage is significantly lower than their sample percentage.
|Language|Kilobytes|%|Samples|%|
|--------|------:|-:|---:|-:|
|tw|106288|0.11|265071|0.34|
|bm|107056|0.11|265180|0.34|
|ak|108096|0.11|265071|0.34|
|eu|108112|0.11|269973|0.34|
|ca|110608|0.12|271191|0.34|
|fon|113072|0.12|265063|0.34|
|st|114080|0.12|265063|0.34|
|ki|115040|0.12|265180|0.34|
|tum|116032|0.12|265063|0.34|
|wo|122560|0.13|365063|0.46|
|ln|126304|0.13|365060|0.46|
|as|156256|0.16|265063|0.34|
|or|161472|0.17|265063|0.34|
|kn|165456|0.17|265063|0.34|
|ml|175040|0.18|265864|0.34|
|rn|192992|0.2|318189|0.4|
|nso|229712|0.24|915051|1.16|
|tn|235536|0.25|915054|1.16|
|lg|235936|0.25|915021|1.16|
|rw|249360|0.26|915043|1.16|
|ts|250256|0.26|915044|1.16|
|sn|252496|0.27|865056|1.1|
|xh|254672|0.27|915058|1.16|
|zu|263712|0.28|915061|1.16|
|ny|272128|0.29|915063|1.16|
|ig|325232|0.34|950097|1.2|
|yo|352784|0.37|918416|1.16|
|ne|393680|0.41|315754|0.4|
|pa|523248|0.55|339210|0.43|
|gu|560688|0.59|347499|0.44|
|sw|560896|0.59|1114455|1.41|
|mr|666240|0.7|417269|0.53|
|bn|832720|0.88|428843|0.54|
|ta|924496|0.97|410633|0.52|
|te|1332912|1.4|573364|0.73|
|ur|1918272|2.02|855756|1.08|
|vi|3101408|3.27|1667306|2.11|
|code|4330752|4.56|2707724|3.43|
|hi|4393696|4.63|1543441|1.96|
|zh|4589904|4.83|3560556|4.51|
|id|4606288|4.85|2627392|3.33|
|ar|4677264|4.93|2148955|2.72|
|fr|5546688|5.84|5055942|6.41|
|pt|6129584|6.46|3562772|4.52|
|es|7571808|7.98|5151349|6.53|
|en|37261104|39.25|31495184|39.93|
|total|94941936|100.0|78883588|100.0|
## Dataset Creation
### Source Data
#### Training datasets
- Code Miscellaneous
- [CodeComplex](https://huggingface.co/datasets/codeparrot/codecomplex)
- [Docstring Corpus](https://huggingface.co/datasets/teven/code_docstring_corpus)
- [GreatCode](https://huggingface.co/datasets/great_code)
- [State Changes](https://huggingface.co/datasets/Fraser/python-state-changes)
- Closed-book QA
- [Hotpot QA](https://huggingface.co/datasets/hotpot_qa)
- [Trivia QA](https://huggingface.co/datasets/trivia_qa)
- [Web Questions](https://huggingface.co/datasets/web_questions)
- [Wiki QA](https://huggingface.co/datasets/wiki_qa)
- Extractive QA
- [Adversarial QA](https://huggingface.co/datasets/adversarial_qa)
- [CMRC2018](https://huggingface.co/datasets/cmrc2018)
- [DRCD](https://huggingface.co/datasets/clue)
- [DuoRC](https://huggingface.co/datasets/duorc)
- [MLQA](https://huggingface.co/datasets/mlqa)
- [Quoref](https://huggingface.co/datasets/quoref)
- [ReCoRD](https://huggingface.co/datasets/super_glue)
- [ROPES](https://huggingface.co/datasets/ropes)
- [SQuAD v2](https://huggingface.co/datasets/squad_v2)
- [xQuAD](https://huggingface.co/datasets/xquad)
- TyDI QA
- [Primary](https://huggingface.co/datasets/khalidalt/tydiqa-primary)
- [Goldp](https://huggingface.co/datasets/khalidalt/tydiqa-goldp)
- Multiple-Choice QA
- [ARC](https://huggingface.co/datasets/ai2_arc)
- [C3](https://huggingface.co/datasets/c3)
- [CoS-E](https://huggingface.co/datasets/cos_e)
- [Cosmos](https://huggingface.co/datasets/cosmos)
- [DREAM](https://huggingface.co/datasets/dream)
- [MultiRC](https://huggingface.co/datasets/super_glue)
- [OpenBookQA](https://huggingface.co/datasets/openbookqa)
- [PiQA](https://huggingface.co/datasets/piqa)
- [QUAIL](https://huggingface.co/datasets/quail)
- [QuaRel](https://huggingface.co/datasets/quarel)
- [QuaRTz](https://huggingface.co/datasets/quartz)
- [QASC](https://huggingface.co/datasets/qasc)
- [RACE](https://huggingface.co/datasets/race)
- [SciQ](https://huggingface.co/datasets/sciq)
- [Social IQA](https://huggingface.co/datasets/social_i_qa)
- [Wiki Hop](https://huggingface.co/datasets/wiki_hop)
- [WiQA](https://huggingface.co/datasets/wiqa)
- Paraphrase Identification
- [MRPC](https://huggingface.co/datasets/super_glue)
- [PAWS](https://huggingface.co/datasets/paws)
- [PAWS-X](https://huggingface.co/datasets/paws-x)
- [QQP](https://huggingface.co/datasets/qqp)
- Program Synthesis
- [APPS](https://huggingface.co/datasets/codeparrot/apps)
- [CodeContests](https://huggingface.co/datasets/teven/code_contests)
- [JupyterCodePairs](https://huggingface.co/datasets/codeparrot/github-jupyter-text-code-pairs)
- [MBPP](https://huggingface.co/datasets/Muennighoff/mbpp)
- [NeuralCodeSearch](https://huggingface.co/datasets/neural_code_search)
- [XLCoST](https://huggingface.co/datasets/codeparrot/xlcost-text-to-code)
- Structure-to-text
- [Common Gen](https://huggingface.co/datasets/common_gen)
- [Wiki Bio](https://huggingface.co/datasets/wiki_bio)
- Sentiment
- [Amazon](https://huggingface.co/datasets/amazon_polarity)
- [App Reviews](https://huggingface.co/datasets/app_reviews)
- [IMDB](https://huggingface.co/datasets/imdb)
- [Rotten Tomatoes](https://huggingface.co/datasets/rotten_tomatoes)
- [Yelp](https://huggingface.co/datasets/yelp_review_full)
- Simplification
- [BiSECT](https://huggingface.co/datasets/GEM/BiSECT)
- Summarization
- [CNN Daily Mail](https://huggingface.co/datasets/cnn_dailymail)
- [Gigaword](https://huggingface.co/datasets/gigaword)
- [MultiNews](https://huggingface.co/datasets/multi_news)
- [SamSum](https://huggingface.co/datasets/samsum)
- [Wiki-Lingua](https://huggingface.co/datasets/GEM/wiki_lingua)
- [XLSum](https://huggingface.co/datasets/GEM/xlsum)
- [XSum](https://huggingface.co/datasets/xsum)
- Topic Classification
- [AG News](https://huggingface.co/datasets/ag_news)
- [DBPedia](https://huggingface.co/datasets/dbpedia_14)
- [TNEWS](https://huggingface.co/datasets/clue)
- [TREC](https://huggingface.co/datasets/trec)
- [CSL](https://huggingface.co/datasets/clue)
- Translation
- [Flores-200](https://huggingface.co/datasets/Muennighoff/flores200)
- [Tatoeba](https://huggingface.co/datasets/Helsinki-NLP/tatoeba_mt)
- Word Sense disambiguation
- [WiC](https://huggingface.co/datasets/super_glue)
- [XL-WiC](https://huggingface.co/datasets/pasinit/xlwic)
#### Evaluation datasets (included in [xP3all](https://huggingface.co/datasets/bigscience/xP3all) except for NLI & HumanEval)
- Natural Language Inference (NLI)
- [ANLI](https://huggingface.co/datasets/anli)
- [CB](https://huggingface.co/datasets/super_glue)
- [RTE](https://huggingface.co/datasets/super_glue)
- [XNLI](https://huggingface.co/datasets/xnli)
- Coreference Resolution
- [Winogrande](https://huggingface.co/datasets/winogrande)
- [XWinograd](https://huggingface.co/datasets/Muennighoff/xwinograd)
- Program Synthesis
- [HumanEval](https://huggingface.co/datasets/openai_humaneval)
- Sentence Completion
- [COPA](https://huggingface.co/datasets/super_glue)
- [Story Cloze](https://huggingface.co/datasets/story_cloze)
- [XCOPA](https://huggingface.co/datasets/xcopa)
- [XStoryCloze](https://huggingface.co/datasets/Muennighoff/xstory_cloze)
## Additional Information
### Licensing Information
The dataset is released under Apache 2.0.
### Citation Information
```bibtex
@misc{muennighoff2022crosslingual,
title={Crosslingual Generalization through Multitask Finetuning},
author={Niklas Muennighoff and Thomas Wang and Lintang Sutawika and Adam Roberts and Stella Biderman and Teven Le Scao and M Saiful Bari and Sheng Shen and Zheng-Xin Yong and Hailey Schoelkopf and Xiangru Tang and Dragomir Radev and Alham Fikri Aji and Khalid Almubarak and Samuel Albanie and Zaid Alyafeai and Albert Webson and Edward Raff and Colin Raffel},
year={2022},
eprint={2211.01786},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
### Contributions
Thanks to the contributors of [promptsource](https://github.com/bigscience-workshop/promptsource/graphs/contributors) for adding many prompts used in this dataset. |
open-llm-leaderboard/details_google__gemma-2b | ---
pretty_name: Evaluation run of google/gemma-2b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [google/gemma-2b](https://huggingface.co/google/gemma-2b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_google__gemma-2b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-22T14:01:00.018926](https://huggingface.co/datasets/open-llm-leaderboard/details_google__gemma-2b/blob/main/results_2024-02-22T14-01-00.018926.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.42038680434244324,\n\
\ \"acc_stderr\": 0.034510117646544144,\n \"acc_norm\": 0.4239350438422939,\n\
\ \"acc_norm_stderr\": 0.03527186768975284,\n \"mc1\": 0.21909424724602203,\n\
\ \"mc1_stderr\": 0.014480038578757442,\n \"mc2\": 0.3308443428097746,\n\
\ \"mc2_stderr\": 0.013470093983653904\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.46928327645051193,\n \"acc_stderr\": 0.014583792546304038,\n\
\ \"acc_norm\": 0.48378839590443684,\n \"acc_norm_stderr\": 0.014603708567414936\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5315674168492333,\n\
\ \"acc_stderr\": 0.004979826829400772,\n \"acc_norm\": 0.7176857199761004,\n\
\ \"acc_norm_stderr\": 0.00449205527940711\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4276315789473684,\n \"acc_stderr\": 0.040260970832965585,\n\
\ \"acc_norm\": 0.4276315789473684,\n \"acc_norm_stderr\": 0.040260970832965585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.44,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4679245283018868,\n \"acc_stderr\": 0.030709486992556545,\n\
\ \"acc_norm\": 0.4679245283018868,\n \"acc_norm_stderr\": 0.030709486992556545\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4583333333333333,\n\
\ \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.4583333333333333,\n\
\ \"acc_norm_stderr\": 0.04166666666666665\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.38,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4277456647398844,\n\
\ \"acc_stderr\": 0.037724468575180255,\n \"acc_norm\": 0.4277456647398844,\n\
\ \"acc_norm_stderr\": 0.037724468575180255\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.14705882352941177,\n \"acc_stderr\": 0.035240689515674474,\n\
\ \"acc_norm\": 0.14705882352941177,\n \"acc_norm_stderr\": 0.035240689515674474\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.032232762667117124,\n\
\ \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.032232762667117124\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4068965517241379,\n \"acc_stderr\": 0.04093793981266237,\n\
\ \"acc_norm\": 0.4068965517241379,\n \"acc_norm_stderr\": 0.04093793981266237\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525208,\n \"\
acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525208\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\
\ \"acc_stderr\": 0.03970158273235172,\n \"acc_norm\": 0.2698412698412698,\n\
\ \"acc_norm_stderr\": 0.03970158273235172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4838709677419355,\n\
\ \"acc_stderr\": 0.028429203176724555,\n \"acc_norm\": 0.4838709677419355,\n\
\ \"acc_norm_stderr\": 0.028429203176724555\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4088669950738916,\n \"acc_stderr\": 0.034590588158832314,\n\
\ \"acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.034590588158832314\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.41818181818181815,\n \"acc_stderr\": 0.03851716319398395,\n\
\ \"acc_norm\": 0.41818181818181815,\n \"acc_norm_stderr\": 0.03851716319398395\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5050505050505051,\n \"acc_stderr\": 0.035621707606254015,\n \"\
acc_norm\": 0.5050505050505051,\n \"acc_norm_stderr\": 0.035621707606254015\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5906735751295337,\n \"acc_stderr\": 0.03548608168860806,\n\
\ \"acc_norm\": 0.5906735751295337,\n \"acc_norm_stderr\": 0.03548608168860806\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.41025641025641024,\n \"acc_stderr\": 0.024939313906940784,\n\
\ \"acc_norm\": 0.41025641025641024,\n \"acc_norm_stderr\": 0.024939313906940784\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712163,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712163\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3865546218487395,\n \"acc_stderr\": 0.03163145807552379,\n \
\ \"acc_norm\": 0.3865546218487395,\n \"acc_norm_stderr\": 0.03163145807552379\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.25165562913907286,\n \"acc_stderr\": 0.03543304234389985,\n \"\
acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.03543304234389985\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5724770642201835,\n \"acc_stderr\": 0.021210910204300437,\n \"\
acc_norm\": 0.5724770642201835,\n \"acc_norm_stderr\": 0.021210910204300437\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.35185185185185186,\n \"acc_stderr\": 0.03256850570293648,\n \"\
acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.03256850570293648\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4411764705882353,\n \"acc_stderr\": 0.034849415144292316,\n \"\
acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.034849415144292316\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.39662447257383965,\n \"acc_stderr\": 0.03184399873811225,\n \
\ \"acc_norm\": 0.39662447257383965,\n \"acc_norm_stderr\": 0.03184399873811225\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4439461883408072,\n\
\ \"acc_stderr\": 0.03334625674242728,\n \"acc_norm\": 0.4439461883408072,\n\
\ \"acc_norm_stderr\": 0.03334625674242728\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4580152671755725,\n \"acc_stderr\": 0.04369802690578756,\n\
\ \"acc_norm\": 0.4580152671755725,\n \"acc_norm_stderr\": 0.04369802690578756\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6115702479338843,\n \"acc_stderr\": 0.04449270350068383,\n \"\
acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.04449270350068383\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4166666666666667,\n\
\ \"acc_stderr\": 0.04766075165356462,\n \"acc_norm\": 0.4166666666666667,\n\
\ \"acc_norm_stderr\": 0.04766075165356462\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4110429447852761,\n \"acc_stderr\": 0.038656978537853624,\n\
\ \"acc_norm\": 0.4110429447852761,\n \"acc_norm_stderr\": 0.038656978537853624\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5631067961165048,\n \"acc_stderr\": 0.04911147107365777,\n\
\ \"acc_norm\": 0.5631067961165048,\n \"acc_norm_stderr\": 0.04911147107365777\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6068376068376068,\n\
\ \"acc_stderr\": 0.03199957924651047,\n \"acc_norm\": 0.6068376068376068,\n\
\ \"acc_norm_stderr\": 0.03199957924651047\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5466155810983397,\n\
\ \"acc_stderr\": 0.017802087135850304,\n \"acc_norm\": 0.5466155810983397,\n\
\ \"acc_norm_stderr\": 0.017802087135850304\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4393063583815029,\n \"acc_stderr\": 0.026720034380514995,\n\
\ \"acc_norm\": 0.4393063583815029,\n \"acc_norm_stderr\": 0.026720034380514995\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23687150837988827,\n\
\ \"acc_stderr\": 0.014219570788103982,\n \"acc_norm\": 0.23687150837988827,\n\
\ \"acc_norm_stderr\": 0.014219570788103982\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4673202614379085,\n \"acc_stderr\": 0.02856869975222588,\n\
\ \"acc_norm\": 0.4673202614379085,\n \"acc_norm_stderr\": 0.02856869975222588\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4115755627009646,\n\
\ \"acc_stderr\": 0.02795048149440126,\n \"acc_norm\": 0.4115755627009646,\n\
\ \"acc_norm_stderr\": 0.02795048149440126\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.027744313443376536,\n\
\ \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.027744313443376536\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3404255319148936,\n \"acc_stderr\": 0.028267657482650144,\n \
\ \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.028267657482650144\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3474576271186441,\n\
\ \"acc_stderr\": 0.0121614177297498,\n \"acc_norm\": 0.3474576271186441,\n\
\ \"acc_norm_stderr\": 0.0121614177297498\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.0290294228156814,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.0290294228156814\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3741830065359477,\n \"acc_stderr\": 0.019576953122088847,\n \
\ \"acc_norm\": 0.3741830065359477,\n \"acc_norm_stderr\": 0.019576953122088847\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4727272727272727,\n\
\ \"acc_stderr\": 0.04782001791380063,\n \"acc_norm\": 0.4727272727272727,\n\
\ \"acc_norm_stderr\": 0.04782001791380063\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.46122448979591835,\n \"acc_stderr\": 0.031912820526692774,\n\
\ \"acc_norm\": 0.46122448979591835,\n \"acc_norm_stderr\": 0.031912820526692774\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.42786069651741293,\n\
\ \"acc_stderr\": 0.03498541988407795,\n \"acc_norm\": 0.42786069651741293,\n\
\ \"acc_norm_stderr\": 0.03498541988407795\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.543859649122807,\n \"acc_stderr\": 0.03820042586602967,\n\
\ \"acc_norm\": 0.543859649122807,\n \"acc_norm_stderr\": 0.03820042586602967\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21909424724602203,\n\
\ \"mc1_stderr\": 0.014480038578757442,\n \"mc2\": 0.3308443428097746,\n\
\ \"mc2_stderr\": 0.013470093983653904\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6629834254143646,\n \"acc_stderr\": 0.013284955769395248\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.16906747536012132,\n \
\ \"acc_stderr\": 0.010324171445497358\n }\n}\n```"
repo_url: https://huggingface.co/google/gemma-2b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
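# Each harness config below exposes one split per evaluation run (named by its
# timestamp) plus a "latest" split that aliases the parquet files of the most
# recent run (2024-02-22T14-01-00.018926).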
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|arc:challenge|25_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|arc:challenge|25_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|arc:challenge|25_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|gsm8k|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|gsm8k|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|gsm8k|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hellaswag|10_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hellaswag|10_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hellaswag|10_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T17-31-49.393135.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T08-30-11.614561.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T14-01-00.018926.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-22T14-01-00.018926.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- '**/details_harness|winogrande|5_2024-02-15T17-31-49.393135.parquet'
- split: 2024_02_16T08_30_11.614561
path:
- '**/details_harness|winogrande|5_2024-02-16T08-30-11.614561.parquet'
- split: 2024_02_22T14_01_00.018926
path:
- '**/details_harness|winogrande|5_2024-02-22T14-01-00.018926.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-22T14-01-00.018926.parquet'
- config_name: results
data_files:
- split: 2024_02_15T17_31_49.393135
path:
- results_2024-02-15T17-31-49.393135.parquet
- split: 2024_02_16T08_30_11.614561
path:
- results_2024-02-16T08-30-11.614561.parquet
- split: 2024_02_22T14_01_00.018926
path:
- results_2024-02-22T14-01-00.018926.parquet
- split: latest
path:
- results_2024-02-22T14-01-00.018926.parquet
---
# Dataset Card for Evaluation run of google/gemma-2b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [google/gemma-2b](https://huggingface.co/google/gemma-2b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_google__gemma-2b",
"harness_winogrande_5",
split="latest")
```
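The aggregated metrics are exposed through the "results" configuration listed in the YAML header above; as a minimal sketch (the config and its "latest" split are both declared there), you can load it the same way:
```python
from datasets import load_dataset

# "results" holds the aggregated metrics; "latest" points to the most recent run.
results = load_dataset("open-llm-leaderboard/details_google__gemma-2b",
	"results",
	split="latest")
```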
## Latest results
These are the [latest results from run 2024-02-22T14:01:00.018926](https://huggingface.co/datasets/open-llm-leaderboard/details_google__gemma-2b/blob/main/results_2024-02-22T14-01-00.018926.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.42038680434244324,
"acc_stderr": 0.034510117646544144,
"acc_norm": 0.4239350438422939,
"acc_norm_stderr": 0.03527186768975284,
"mc1": 0.21909424724602203,
"mc1_stderr": 0.014480038578757442,
"mc2": 0.3308443428097746,
"mc2_stderr": 0.013470093983653904
},
"harness|arc:challenge|25": {
"acc": 0.46928327645051193,
"acc_stderr": 0.014583792546304038,
"acc_norm": 0.48378839590443684,
"acc_norm_stderr": 0.014603708567414936
},
"harness|hellaswag|10": {
"acc": 0.5315674168492333,
"acc_stderr": 0.004979826829400772,
"acc_norm": 0.7176857199761004,
"acc_norm_stderr": 0.00449205527940711
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4276315789473684,
"acc_stderr": 0.040260970832965585,
"acc_norm": 0.4276315789473684,
"acc_norm_stderr": 0.040260970832965585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4679245283018868,
"acc_stderr": 0.030709486992556545,
"acc_norm": 0.4679245283018868,
"acc_norm_stderr": 0.030709486992556545
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.04166666666666665,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.04166666666666665
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4277456647398844,
"acc_stderr": 0.037724468575180255,
"acc_norm": 0.4277456647398844,
"acc_norm_stderr": 0.037724468575180255
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.14705882352941177,
"acc_stderr": 0.035240689515674474,
"acc_norm": 0.14705882352941177,
"acc_norm_stderr": 0.035240689515674474
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.032232762667117124,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.032232762667117124
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4068965517241379,
"acc_stderr": 0.04093793981266237,
"acc_norm": 0.4068965517241379,
"acc_norm_stderr": 0.04093793981266237
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525208,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525208
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235172,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4838709677419355,
"acc_stderr": 0.028429203176724555,
"acc_norm": 0.4838709677419355,
"acc_norm_stderr": 0.028429203176724555
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4088669950738916,
"acc_stderr": 0.034590588158832314,
"acc_norm": 0.4088669950738916,
"acc_norm_stderr": 0.034590588158832314
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.41818181818181815,
"acc_stderr": 0.03851716319398395,
"acc_norm": 0.41818181818181815,
"acc_norm_stderr": 0.03851716319398395
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5050505050505051,
"acc_stderr": 0.035621707606254015,
"acc_norm": 0.5050505050505051,
"acc_norm_stderr": 0.035621707606254015
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5906735751295337,
"acc_stderr": 0.03548608168860806,
"acc_norm": 0.5906735751295337,
"acc_norm_stderr": 0.03548608168860806
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.41025641025641024,
"acc_stderr": 0.024939313906940784,
"acc_norm": 0.41025641025641024,
"acc_norm_stderr": 0.024939313906940784
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712163,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712163
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3865546218487395,
"acc_stderr": 0.03163145807552379,
"acc_norm": 0.3865546218487395,
"acc_norm_stderr": 0.03163145807552379
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.03543304234389985,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.03543304234389985
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5724770642201835,
"acc_stderr": 0.021210910204300437,
"acc_norm": 0.5724770642201835,
"acc_norm_stderr": 0.021210910204300437
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.03256850570293648,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.03256850570293648
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.034849415144292316,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.034849415144292316
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.39662447257383965,
"acc_stderr": 0.03184399873811225,
"acc_norm": 0.39662447257383965,
"acc_norm_stderr": 0.03184399873811225
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4439461883408072,
"acc_stderr": 0.03334625674242728,
"acc_norm": 0.4439461883408072,
"acc_norm_stderr": 0.03334625674242728
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4580152671755725,
"acc_stderr": 0.04369802690578756,
"acc_norm": 0.4580152671755725,
"acc_norm_stderr": 0.04369802690578756
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6115702479338843,
"acc_stderr": 0.04449270350068383,
"acc_norm": 0.6115702479338843,
"acc_norm_stderr": 0.04449270350068383
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.04766075165356462,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.04766075165356462
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4110429447852761,
"acc_stderr": 0.038656978537853624,
"acc_norm": 0.4110429447852761,
"acc_norm_stderr": 0.038656978537853624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.5631067961165048,
"acc_stderr": 0.04911147107365777,
"acc_norm": 0.5631067961165048,
"acc_norm_stderr": 0.04911147107365777
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6068376068376068,
"acc_stderr": 0.03199957924651047,
"acc_norm": 0.6068376068376068,
"acc_norm_stderr": 0.03199957924651047
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5466155810983397,
"acc_stderr": 0.017802087135850304,
"acc_norm": 0.5466155810983397,
"acc_norm_stderr": 0.017802087135850304
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4393063583815029,
"acc_stderr": 0.026720034380514995,
"acc_norm": 0.4393063583815029,
"acc_norm_stderr": 0.026720034380514995
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23687150837988827,
"acc_stderr": 0.014219570788103982,
"acc_norm": 0.23687150837988827,
"acc_norm_stderr": 0.014219570788103982
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4673202614379085,
"acc_stderr": 0.02856869975222588,
"acc_norm": 0.4673202614379085,
"acc_norm_stderr": 0.02856869975222588
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4115755627009646,
"acc_stderr": 0.02795048149440126,
"acc_norm": 0.4115755627009646,
"acc_norm_stderr": 0.02795048149440126
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.027744313443376536,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.027744313443376536
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3404255319148936,
"acc_stderr": 0.028267657482650144,
"acc_norm": 0.3404255319148936,
"acc_norm_stderr": 0.028267657482650144
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3474576271186441,
"acc_stderr": 0.0121614177297498,
"acc_norm": 0.3474576271186441,
"acc_norm_stderr": 0.0121614177297498
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.0290294228156814,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.0290294228156814
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3741830065359477,
"acc_stderr": 0.019576953122088847,
"acc_norm": 0.3741830065359477,
"acc_norm_stderr": 0.019576953122088847
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4727272727272727,
"acc_stderr": 0.04782001791380063,
"acc_norm": 0.4727272727272727,
"acc_norm_stderr": 0.04782001791380063
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.46122448979591835,
"acc_stderr": 0.031912820526692774,
"acc_norm": 0.46122448979591835,
"acc_norm_stderr": 0.031912820526692774
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.42786069651741293,
"acc_stderr": 0.03498541988407795,
"acc_norm": 0.42786069651741293,
"acc_norm_stderr": 0.03498541988407795
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.543859649122807,
"acc_stderr": 0.03820042586602967,
"acc_norm": 0.543859649122807,
"acc_norm_stderr": 0.03820042586602967
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21909424724602203,
"mc1_stderr": 0.014480038578757442,
"mc2": 0.3308443428097746,
"mc2_stderr": 0.013470093983653904
},
"harness|winogrande|5": {
"acc": 0.6629834254143646,
"acc_stderr": 0.013284955769395248
},
"harness|gsm8k|5": {
"acc": 0.16906747536012132,
"acc_stderr": 0.010324171445497358
}
}
```
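As an alternative to the Parquet splits, the raw results file linked above can be fetched directly; a minimal sketch using `huggingface_hub` (assuming it is installed):
```python
import json
from huggingface_hub import hf_hub_download

# repo_type="dataset" targets the dataset repo rather than a model repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_google__gemma-2b",
    filename="results_2024-02-22T14-01-00.018926.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)
print(list(data))  # top-level keys; the per-task metrics mirror the snippet above
```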
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Harshithacj123/cireco_chat_abstracts | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 22152
num_examples: 50
download_size: 9554
dataset_size: 22152
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "cireco_chat_abstracts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jtatman/textbooks-are-all-you-need-lite-instruct | ---
dataset_info:
features:
- name: system
dtype: string
- name: instruction
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 2851916880
num_examples: 681845
download_size: 1231543746
dataset_size: 2851916880
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Lbuk/alpaca_data_pl.json | ---
license: agpl-3.0
---
|
tyzhu/wiki_find_passage_train10_eval40_num | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 47708
num_examples: 60
- name: validation
num_bytes: 33332
num_examples: 40
download_size: 66671
dataset_size: 81040
---
# Dataset Card for "wiki_find_passage_train10_eval40_num"
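The YAML header above declares `train` and `validation` splits, each with plain-string `inputs` and `targets` columns; a minimal usage sketch:
```python
from datasets import load_dataset

# Load the 40-example validation split declared in the config above.
val = load_dataset("tyzhu/wiki_find_passage_train10_eval40_num", split="validation")
for row in val:
    print(row["inputs"], "->", row["targets"])
    break
```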
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
W1lson/testt | ---
dataset_info:
features:
- name: Category
dtype: string
- name: Description
dtype: string
splits:
- name: train
num_bytes: 4499
num_examples: 100
download_size: 3168
dataset_size: 4499
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "testt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/squad_qa_rare_v5_full | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 7297958
num_examples: 5070
- name: validation
num_bytes: 345326
num_examples: 300
download_size: 0
dataset_size: 7643284
---
# Dataset Card for "squad_qa_rare_v5_full"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mteb-pt/tweet_sentiment_extraction | ---
configs:
- config_name: pt-br
data_files:
- split: train
path: train*
- split: test
path: test*
--- |
ChanceFocus/flare-ectsum | ---
dataset_info:
features:
- name: id
dtype: string
- name: query
dtype: string
- name: answer
dtype: string
- name: label
sequence: int64
- name: text
dtype: string
splits:
- name: test
num_bytes: 7121761
num_examples: 495
download_size: 3357696
dataset_size: 7121761
---
# Dataset Card for "flare-ectsum"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ashwincv0112/SAS_Python_Conversion | ---
dataset_info:
features:
- name: SAS Code
dtype: string
- name: Converted Python Code
dtype: string
splits:
- name: train
num_bytes: 6362
num_examples: 30
download_size: 5247
dataset_size: 6362
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "SAS_Python_Conversion"
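Per the metadata above, each of the 30 records pairs a `SAS Code` string with a `Converted Python Code` string; note that the column names contain spaces, so they must be indexed with the exact strings from the YAML. A minimal access sketch:
```python
from datasets import load_dataset

ds = load_dataset("ashwincv0112/SAS_Python_Conversion", split="train")
example = ds[0]
print(example["SAS Code"])               # original SAS snippet
print(example["Converted Python Code"])  # its Python translation
```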
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ENSEONG/jungdae | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 231777
num_examples: 135
download_size: 101263
dataset_size: 231777
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "jungdae"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kanishka/counterfactual_babylm_measure_nps_as_singular | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 581819977
num_examples: 11668069
- name: validation
num_bytes: 56120230
num_examples: 1026747
download_size: 421729059
dataset_size: 637940207
---
# Dataset Card for "counterfactual_babylm_measure_nps_as_singular"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
oliveirabruno01/shaped-svgs-small-unlabeled | ---
task_categories:
- text-generation
- text-to-image
- text2text-generation
language:
- en
size_categories:
- n<1K
--- |
open-llm-leaderboard/details_VMware__open-llama-0.7T-7B-open-instruct-v1.1 | ---
pretty_name: Evaluation run of VMware/open-llama-0.7T-7B-open-instruct-v1.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [VMware/open-llama-0.7T-7B-open-instruct-v1.1](https://huggingface.co/VMware/open-llama-0.7T-7B-open-instruct-v1.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_VMware__open-llama-0.7T-7B-open-instruct-v1.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"latest\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T16:28:57.992845](https://huggingface.co/datasets/open-llm-leaderboard/details_VMware__open-llama-0.7T-7B-open-instruct-v1.1/blob/main/results_2023-09-22T16-28-57.992845.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.23406040268456377,\n\
\ \"em_stderr\": 0.004336115943633415,\n \"f1\": 0.28612730704698025,\n\
\ \"f1_stderr\": 0.004340090005641948,\n \"acc\": 0.3309415003712961,\n\
\ \"acc_stderr\": 0.007877939232005797\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.23406040268456377,\n \"em_stderr\": 0.004336115943633415,\n\
\ \"f1\": 0.28612730704698025,\n \"f1_stderr\": 0.004340090005641948\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0075815011372251705,\n \
\ \"acc_stderr\": 0.002389281512077218\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.654301499605367,\n \"acc_stderr\": 0.013366596951934375\n\
\ }\n}\n```"
repo_url: https://huggingface.co/VMware/open-llama-0.7T-7B-open-instruct-v1.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T16_28_57.992845
path:
- '**/details_harness|drop|3_2023-09-22T16-28-57.992845.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T16-28-57.992845.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T16_28_57.992845
path:
- '**/details_harness|gsm8k|5_2023-09-22T16-28-57.992845.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T16-28-57.992845.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:57:28.493539.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:57:28.493539.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:57:28.493539.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T16_28_57.992845
path:
- '**/details_harness|winogrande|5_2023-09-22T16-28-57.992845.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T16-28-57.992845.parquet'
- config_name: results
data_files:
- split: 2023_07_19T16_57_28.493539
path:
- results_2023-07-19T16:57:28.493539.parquet
- split: 2023_09_22T16_28_57.992845
path:
- results_2023-09-22T16-28-57.992845.parquet
- split: latest
path:
- results_2023-09-22T16-28-57.992845.parquet
---
# Dataset Card for Evaluation run of VMware/open-llama-0.7T-7B-open-instruct-v1.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/VMware/open-llama-0.7T-7B-open-instruct-v1.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [VMware/open-llama-0.7T-7B-open-instruct-v1.1](https://huggingface.co/VMware/open-llama-0.7T-7B-open-instruct-v1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_VMware__open-llama-0.7T-7B-open-instruct-v1.1",
"harness_winogrande_5",
split="train")
```
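The aggregated metrics live in the "results" configuration listed above; for instance, to load the latest aggregated results:
```python
from datasets import load_dataset
results = load_dataset("open-llm-leaderboard/details_VMware__open-llama-0.7T-7B-open-instruct-v1.1",
	"results",
	split="latest")
```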
## Latest results
These are the [latest results from run 2023-09-22T16:28:57.992845](https://huggingface.co/datasets/open-llm-leaderboard/details_VMware__open-llama-0.7T-7B-open-instruct-v1.1/blob/main/results_2023-09-22T16-28-57.992845.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.23406040268456377,
"em_stderr": 0.004336115943633415,
"f1": 0.28612730704698025,
"f1_stderr": 0.004340090005641948,
"acc": 0.3309415003712961,
"acc_stderr": 0.007877939232005797
},
"harness|drop|3": {
"em": 0.23406040268456377,
"em_stderr": 0.004336115943633415,
"f1": 0.28612730704698025,
"f1_stderr": 0.004340090005641948
},
"harness|gsm8k|5": {
"acc": 0.0075815011372251705,
"acc_stderr": 0.002389281512077218
},
"harness|winogrande|5": {
"acc": 0.654301499605367,
"acc_stderr": 0.013366596951934375
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
karanravindra/QMNIST | ---
size_categories:
- 10K<n<100K
---
|
ammaralam/guanaco-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966692
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gradio/chatinterface_with_image_csv | ---
configs:
- config_name: default
data_files:
- split: train
path: data.csv
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
luzco13/leonardo | ---
license: openrail
---
|
deetsadi/processed_dwi_sobel_with_adc_cropped | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: conditioning_image
dtype: image
splits:
- name: train
num_bytes: 31198435.0
num_examples: 200
download_size: 31139462
dataset_size: 31198435.0
---
# Dataset Card for "processed_dwi_sobel_with_adc_cropped"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
magicmachine/book-of-lore | ---
license: cc-by-nc-3.0
language:
- en
pretty_name: Book of Lore
---
# Forgotten Runes Wizards Cult Book of Lore Datasets
This repository contains snapshots of [The Book of Lore](https://www.forgottenrunes.com/lore), which is a collaborative tome
documenting the stories of the Wizards, Souls, Warriors, Ponies, and Beasts of the Runiverse.
## Guide to the datasets:
* `tokenized-book-of-lore-400.jsonl` - 400 token chunks encoded with tiktoken `cl100k_base` encoding
* `tokenized-book-of-lore-cl100k_base-400-text-embedding-ada-002.jsonl` - adds embeddings with `ada-002`
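A minimal sketch for decoding one of the 400-token chunks back to text with `tiktoken`; the `tokens` field name is an assumption, so inspect the JSONL keys first:
```python
import json

import tiktoken

# The chunks were encoded with tiktoken's cl100k_base encoding (see above).
enc = tiktoken.get_encoding("cl100k_base")

with open("tokenized-book-of-lore-400.jsonl") as f:
    row = json.loads(next(f))
    tokens = row["tokens"]  # field name is an assumption -- check row.keys()
    print(enc.decode(tokens))
```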
## See Also
* [Wizzypedia dataset](https://huggingface.co/datasets/magicmachine/wizzypedia) |
Jeska/autonlp-data-vaccinfaq | ---
task_categories:
- text-classification
---
# AutoNLP Dataset for project: vaccinfaq
## Table of Contents
- [Dataset Description](#dataset-description)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
## Dataset Description
This dataset has been automatically processed by AutoNLP for project vaccinfaq.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"target": 6,
"text": "What je naam?"
},
{
"target": 6,
"text": "Hoe heet je?"
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"target": "ClassLabel(num_classes=181, names=['chitchat_ask_bye', 'chitchat_ask_hi', 'chitchat_ask_hi_de', 'chitchat_ask_hi_en', 'chitchat_ask_hi_fr', 'chitchat_ask_hoe_gaat_het', 'chitchat_ask_name', 'chitchat_ask_thanks', 'faq_ask_aantal_gevaccineerd', 'faq_ask_aantal_gevaccineerd_wereldwijd', 'faq_ask_afspraak_afzeggen', 'faq_ask_afspraak_gemist', 'faq_ask_algemeen_info', 'faq_ask_allergisch_na_vaccinatie', 'faq_ask_alternatieve_medicatie', 'faq_ask_andere_vaccins', 'faq_ask_astrazeneca', 'faq_ask_astrazeneca_bij_ouderen', 'faq_ask_astrazeneca_bloedklonters', 'faq_ask_astrazeneca_prik_2', 'faq_ask_attest', 'faq_ask_autisme_na_vaccinatie', 'faq_ask_auto-immuun', 'faq_ask_begeleiding', 'faq_ask_beschermen', 'faq_ask_beschermingsduur', 'faq_ask_beschermingspercentage', 'faq_ask_besmetten_na_vaccin', 'faq_ask_betalen_voor_vaccin', 'faq_ask_betrouwbaar', 'faq_ask_betrouwbare_bronnen', 'faq_ask_bijsluiter', 'faq_ask_bijwerking_AZ', 'faq_ask_bijwerking_JJ', 'faq_ask_bijwerking_algemeen', 'faq_ask_bijwerking_lange_termijn', 'faq_ask_bijwerking_moderna', 'faq_ask_bijwerking_pfizer', 'faq_ask_bloed_geven', 'faq_ask_borstvoeding', 'faq_ask_buitenlander', 'faq_ask_chronisch_ziek', 'faq_ask_combi', 'faq_ask_complottheorie', 'faq_ask_complottheorie_5G', 'faq_ask_complottheorie_Bill_Gates', 'faq_ask_contra_ind', 'faq_ask_corona_is_griep', 'faq_ask_corona_vermijden', 'faq_ask_covid_door_vaccin', 'faq_ask_curevac', 'faq_ask_derde_prik', 'faq_ask_dna', 'faq_ask_duur_vaccinatie', 'faq_ask_eerst_weigeren', 'faq_ask_eerste_prik_buitenland', 'faq_ask_essentieel_beroep', 'faq_ask_experimenteel', 'faq_ask_foetus', 'faq_ask_geen_antwoord', 'faq_ask_geen_risicopatient', 'faq_ask_geen_uitnodiging', 'faq_ask_gestockeerd', 'faq_ask_gezondheidstoestand_gekend', 'faq_ask_gif_in_vaccin', 'faq_ask_goedkeuring', 'faq_ask_groepsimmuniteit', 'faq_ask_hartspierontsteking', 'faq_ask_hersenziekte', 'faq_ask_hoe_dodelijk', 'faq_ask_hoe_weet_overheid', 'faq_ask_hoeveel_dosissen', 'faq_ask_huisarts', 'faq_ask_huisdieren', 'faq_ask_iedereen', 'faq_ask_in_vaccin', 'faq_ask_info_vaccins', 'faq_ask_janssen', 'faq_ask_janssen_een_dosis', 'faq_ask_jong_en_gezond', 'faq_ask_keuze', 'faq_ask_keuze_vaccinatiecentrum', 'faq_ask_kinderen', 'faq_ask_kosjer_halal', 'faq_ask_leveringen', 'faq_ask_logistiek', 'faq_ask_logistiek_veilig', 'faq_ask_magnetisch', 'faq_ask_man_vrouw_verschillen', 'faq_ask_mantelzorger', 'faq_ask_maximaal_een_dosis', 'faq_ask_meer_bijwerkingen_tweede_dosis', 'faq_ask_minder_mobiel', 'faq_ask_moderna', 'faq_ask_mondmasker', 'faq_ask_motiveren', 'faq_ask_mrna_vs_andere_vaccins', 'faq_ask_naaldangst', 'faq_ask_nadelen', 'faq_ask_nuchter', 'faq_ask_ontwikkeling', 'faq_ask_onvruchtbaar', 'faq_ask_oplopen_vaccinatie', 'faq_ask_pfizer', 'faq_ask_phishing', 'faq_ask_pijnstiller', 'faq_ask_planning_eerstelijnszorg', 'faq_ask_planning_ouderen', 'faq_ask_positieve_test_na_vaccin', 'faq_ask_prioritaire_gropen', 'faq_ask_privacy', 'faq_ask_probleem_registratie', 'faq_ask_problemen_uitnodiging', 'faq_ask_quarantaine', 'faq_ask_qvax_probleem', 'faq_ask_reproductiegetal', 'faq_ask_risicopatient', 'faq_ask_risicopatient_diabetes', 'faq_ask_risicopatient_hartvaat', 'faq_ask_risicopatient_immuunziekte', 'faq_ask_risicopatient_kanker', 'faq_ask_risicopatient_luchtwegaandoening', 'faq_ask_smaakverlies', 'faq_ask_snel_ontwikkeld', 'faq_ask_sneller_aan_de_beurt', 'faq_ask_taxi', 'faq_ask_test_voor_vaccin', 'faq_ask_testen', 'faq_ask_tijd_tot_tweede_dosis', 'faq_ask_timing_andere_vaccins', 'faq_ask_trage_start', 
'faq_ask_tweede_dosis_afspraak', 'faq_ask_tweede_dosis_vervroegen', 'faq_ask_twijfel_bijwerkingen', 'faq_ask_twijfel_effectiviteit', 'faq_ask_twijfel_inhoud', 'faq_ask_twijfel_ivm_vaccinatie', 'faq_ask_twijfel_noodzaak', 'faq_ask_twijfel_ontwikkeling', 'faq_ask_twijfel_praktisch', 'faq_ask_twijfel_vaccins_zelf', 'faq_ask_twijfel_vrijheid', 'faq_ask_uit_flacon', 'faq_ask_uitnodiging_afspraak_kwijt', 'faq_ask_uitnodiging_na_vaccinatie', 'faq_ask_vaccin_doorgeven', 'faq_ask_vaccin_immuunsysteem', 'faq_ask_vaccin_variant', 'faq_ask_vaccinatiecentrum', 'faq_ask_vaccine_covid_gehad', 'faq_ask_vaccine_covid_gehad_effect', 'faq_ask_vakantie', 'faq_ask_veelgestelde_vragen', 'faq_ask_vegan', 'faq_ask_verplicht', 'faq_ask_verschillen', 'faq_ask_vrijwillig_Janssen', 'faq_ask_vrijwilliger', 'faq_ask_waar_en_wanneer', 'faq_ask_waarom', 'faq_ask_waarom_niet_verplicht', 'faq_ask_waarom_ouderen_eerst', 'faq_ask_waarom_twee_prikken', 'faq_ask_waarom_twijfel', 'faq_ask_wanneer_algemene_bevolking', 'faq_ask_wanneer_iedereen_gevaccineerd', 'faq_ask_wat_is_corona', 'faq_ask_wat_is_rna', 'faq_ask_wat_is_vaccin', 'faq_ask_wat_na_vaccinatie', 'faq_ask_welk_vaccin_krijg_ik', 'faq_ask_welke_vaccin', 'faq_ask_wie_ben_ik', 'faq_ask_wie_doet_inenting', 'faq_ask_wie_is_risicopatient', 'faq_ask_wie_nu', 'faq_ask_wilsonbekwaam', 'faq_ask_zwanger', 'get_started', 'nlu_fallback', 'test'], names_file=None, id=None)",
"text": "Value(dtype='string', id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 11651 |
| valid | 1267 |
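Assuming the repository is accessible to your account, the splits above can be loaded with 🤗 Datasets (a minimal sketch):
```python
from datasets import load_dataset

# "train" and "valid" follow the split table above.
dataset = load_dataset("Jeska/autonlp-data-vaccinfaq")
print(dataset["train"][0])  # {'target': ..., 'text': ...}
```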
|
justinlamlamlam/wiki_context_open_orca | ---
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 7174975
num_examples: 424
download_size: 4000054
dataset_size: 7174975
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
daven3/geosignal | ---
license: apache-2.0
task_categories:
- question-answering
---
## Instruction Tuning: GeoSignal
Scientific domain adaptation involves two main steps during instruction tuning:
- Instruction tuning with general instruction-tuning data. Here we use Alpaca-GPT4.
- Instruction tuning with restructured domain knowledge, which we call expertise instruction tuning. For K2, we use knowledge-intensive instruction data, GeoSignal.
***The following illustrates the recipe for training a domain-specific language model:***

- **Adapter Model on [Huggingface](https://huggingface.co/): [daven3/k2_it_adapter](https://huggingface.co/daven3/k2_it_adapter)**
For the design of the GeoSignal, we collect knowledge from various data sources, like:

GeoSignal is designed for knowledge-intensive instruction tuning and is used for aligning with experts.
The full version will be uploaded soon; alternatively, email [daven](mailto:davendw@sjtu.edu.cn) for potential research cooperation.
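Until the full release, the published portion can be loaded as usual (a minimal sketch; the exact config and split layout is an assumption):
```python
from datasets import load_dataset

# Config/split layout is an assumption -- inspect the returned DatasetDict.
dataset = load_dataset("daven3/geosignal")
print(dataset)
```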
|
Atipico1/nq-output | ---
dataset_info:
features:
- name: question
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
- name: masked_query
dtype: string
- name: original_case
list:
- name: answer
dtype: string
- name: context
dtype: string
- name: distance
dtype: string
- name: original_answers
sequence: string
- name: question
dtype: string
- name: unans_case
list:
- name: answer
dtype: string
- name: answers
sequence: string
- name: context
dtype: string
- name: distance
dtype: string
- name: original_answers
sequence: string
- name: question
dtype: string
- name: conflict_case
list:
- name: answer
dtype: string
- name: conflict_context
dtype: string
- name: context
dtype: string
- name: distance
dtype: string
- name: original_answers
sequence: string
- name: question
dtype: string
- name: context
dtype: string
- name: context_vague
dtype: string
splits:
- name: train
num_bytes: 155060836
num_examples: 10000
- name: test
num_bytes: 56240742
num_examples: 3610
download_size: 120629521
dataset_size: 211301578
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
notrichardren/ms_tf | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: Topic
dtype: string
- name: Question
dtype: string
- name: Correct
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 236742
num_examples: 2252
download_size: 110349
dataset_size: 236742
---
# Dataset Card for "ms_tf"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/nfcorpus_dev_video | ---
pretty_name: '`nfcorpus/dev/video`'
viewer: false
source_datasets: ['irds/nfcorpus']
task_categories:
- text-retrieval
---
# Dataset Card for `nfcorpus/dev/video`
The `nfcorpus/dev/video` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/nfcorpus#nfcorpus/dev/video).
# Data
This dataset provides:
- `queries` (i.e., topics); count=102
- `qrels` (relevance assessments); count=3,068
- For `docs`, use [`irds/nfcorpus`](https://huggingface.co/datasets/irds/nfcorpus)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/nfcorpus_dev_video', 'queries')
for record in queries:
record # {'query_id': ..., 'title': ..., 'desc': ...}
qrels = load_dataset('irds/nfcorpus_dev_video', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@inproceedings{Boteva2016Nfcorpus,
title="A Full-Text Learning to Rank Dataset for Medical Information Retrieval",
author = "Vera Boteva and Demian Gholipour and Artem Sokolov and Stefan Riezler",
booktitle = "Proceedings of the European Conference on Information Retrieval ({ECIR})",
location = "Padova, Italy",
publisher = "Springer",
year = 2016
}
```
|
fxmeng/llava-finetune | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 2481431976
num_examples: 3444246
download_size: 443612855
dataset_size: 2481431976
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llava-finetune"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kanoyo/kanoyo-rvc-fork | ---
license: mit
---
|
azhx/counterfact-easy | ---
dataset_info:
features:
- name: subject
dtype: string
- name: proposition
dtype: string
- name: label
dtype:
class_label:
names:
'0': 'False'
'1': 'True'
- name: case_id
dtype: int64
splits:
- name: train
num_bytes: 3112032.700396916
num_examples: 39455
- name: test
num_bytes: 345711.2996030841
num_examples: 4383
download_size: 1618051
dataset_size: 3457744.0
---
# Dataset Card for "counterfact-easy"
The dataset from ROME, simplified to contain just the main assertions (no paraphrased prompts included) |
zolak/twitter_dataset_1713006215 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: float64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 4095315
num_examples: 10098
download_size: 2027196
dataset_size: 4095315
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ICML2022/EfficientDatasetCondensation | ---
license: mit
data_type: image (0-1 ranged float)
---
### Data summary
- This repository contains small synthetic data for image datasets: MNIST, SVHN, and CIFAR-10.
- Each torch file contains the images and corresponding labels, with sizes of 1, 10, or 50 images per class (IPC); see the loading sketch below.
- For more details, please refer to our GitHub page and paper below.
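A minimal loading sketch with `huggingface_hub` and `torch`; the filename and the payload layout are assumptions, so consult the repository's file listing:
```python
import torch
from huggingface_hub import hf_hub_download

# Hypothetical filename -- check the repo's file listing for the real paths.
path = hf_hub_download(
    repo_id="ICML2022/EfficientDatasetCondensation",
    repo_type="dataset",
    filename="cifar10_ipc10.pt",
)
payload = torch.load(path)  # assumed: condensed images (0-1 floats) and labels
print(type(payload))
```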
### Reference
https://github.com/snu-mllab/Efficient-Dataset-Condensation
### Citation
```
@inproceedings{kimICML22,
title = {Dataset Condensation via Efficient Synthetic-Data Parameterization},
author = {Kim, Jang-Hyun and Kim, Jinuk and Oh, Seong Joon and Yun, Sangdoo and Song, Hwanjun and Jeong, Joonhyun and Ha, Jung-Woo and Song, Hyun Oh},
booktitle = {International Conference on Machine Learning (ICML)},
year = {2022}
}
``` |
edbeeching/sample_factory_videos | ---
license: mit
---
|
SalomonMetre13/nnd_fr_14k | ---
license: mit
language:
- nnd
task_categories:
- translation
size_categories:
- 10K<n<100K
---
This <span style="color:teal;">parallel corpus</span> contains <span style="color:teal;">14,478</span> aligned <span style="color:teal;">Nande–French</span> sentence pairs in a <span style="color:teal;">90:10</span> train/test split. It has mainly been used to fine-tune the <span style="color:teal;">t5-base</span> pretrained model for the development of <a href="https://huggingface.co/SalomonMetre13/nnd_fr_mt" style="color:green;">this translation model</a> |
CyberHarem/makabe_mizuki_theidolmstermillionlive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of makabe_mizuki/真壁瑞希 (THE iDOLM@STER: Million Live!)
This is the dataset of makabe_mizuki/真壁瑞希 (THE iDOLM@STER: Million Live!), containing 500 images and their tags.
The core tags of this character are `purple_hair, short_hair, yellow_eyes, bangs, sidelocks`, which are pruned in this dataset.
Images are crawled from many sites (e.g., danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 567.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/makabe_mizuki_theidolmstermillionlive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 344.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/makabe_mizuki_theidolmstermillionlive/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1133 | 704.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/makabe_mizuki_theidolmstermillionlive/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 509.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/makabe_mizuki_theidolmstermillionlive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1133 | 971.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/makabe_mizuki_theidolmstermillionlive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/makabe_mizuki_theidolmstermillionlive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, beret, long_sleeves, solo, hairclip, jacket, looking_at_viewer, shirt, blush, sweater, upper_body, x_hair_ornament, black_headwear, black_ribbon, neck_ribbon, open_clothes, skirt, smile |
| 1 | 6 |  |  |  |  |  | 1girl, blue_shirt, looking_at_viewer, short_sleeves, white_background, blue_skirt, expressionless, pleated_skirt, simple_background, solo, small_breasts, black_skirt, checkered_necktie, green_necktie, light_blush, wavy_hair |
| 2 | 5 |  |  |  |  |  | 1girl, blue_shirt, looking_at_viewer, short_sleeves, solo, upper_body, collared_shirt, green_necktie, simple_background, white_background, wing_collar, blush, plaid_necktie, smile |
| 3 | 5 |  |  |  |  |  | 1girl, blush, floral_print, hair_flower, looking_at_viewer, obi, solo, blue_kimono, upper_body, print_kimono, white_background, yukata, festival, holding_stuffed_toy, object_hug, simple_background, stuffed_shark, wide_sleeves |
| 4 | 7 |  |  |  |  |  | bare_shoulders, looking_at_viewer, blush, frilled_dress, hairband, heart, white_gloves, 1girl, bow, hair_flower, holding_card, playing_card, puffy_short_sleeves, solo, detached_collar, expressionless, hands_up, orange_dress, white_collar, buttons, ribbon, smile, wavy_hair, wrist_cuffs |
| 5 | 6 |  |  |  |  |  | navel, nipples, small_breasts, 1girl, female_pubic_hair, solo, blush, completely_nude, looking_at_viewer, open_mouth |
| 6 | 6 |  |  |  |  |  | fake_mustache, long_sleeves, 1girl, black_jacket, black_pants, red_bowtie, solo, white_shirt, buttons, glasses, monocle, center_frills, frilled_sleeves, hat, holding, looking_at_viewer |
| 7 | 7 |  |  |  |  |  | 1girl, looking_at_viewer, navel, solo, blush, small_breasts, simple_background, white_bikini, frilled_bikini, white_background, armpits, necklace |
| 8 | 7 |  |  |  |  |  | 1boy, 1girl, blush, female_pubic_hair, hetero, solo_focus, nipples, small_breasts, necktie, penis, pussy, spread_legs, sweat, anus, bar_censor, blue_skirt, kneehighs, navel, no_bra, one_eye_closed, underwear |
| 9 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, solo, detached_sleeves, frills, small_breasts, white_background, blue_headwear, closed_mouth, mini_hat, simple_background, upper_body, black_sleeves, blue_dress, blue_sleeves, blush, bow, collarbone, parted_lips, serious, shorts, wavy_hair |
| 10 | 5 |  |  |  |  |  | 1girl, black_dress, frills, looking_at_viewer, solo, string, hair_flower, juliet_sleeves, lolita_fashion, parted_lips, ribbon, black_bow, black_pantyhose, black_rose, blue_bow, blue_flower, petals, simple_background, small_breasts, white_background |
| 11 | 6 |  |  |  |  |  | 1girl, athletic_leotard, simple_background, small_breasts, two-tone_leotard, white_background, white_leotard, white_pantyhose, looking_at_viewer, solo, star_print, gymnastics, split, standing |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | beret | long_sleeves | solo | hairclip | jacket | looking_at_viewer | shirt | blush | sweater | upper_body | x_hair_ornament | black_headwear | black_ribbon | neck_ribbon | open_clothes | skirt | smile | blue_shirt | short_sleeves | white_background | blue_skirt | expressionless | pleated_skirt | simple_background | small_breasts | black_skirt | checkered_necktie | green_necktie | light_blush | wavy_hair | collared_shirt | wing_collar | plaid_necktie | floral_print | hair_flower | obi | blue_kimono | print_kimono | yukata | festival | holding_stuffed_toy | object_hug | stuffed_shark | wide_sleeves | bare_shoulders | frilled_dress | hairband | heart | white_gloves | bow | holding_card | playing_card | puffy_short_sleeves | detached_collar | hands_up | orange_dress | white_collar | buttons | ribbon | wrist_cuffs | navel | nipples | female_pubic_hair | completely_nude | open_mouth | fake_mustache | black_jacket | black_pants | red_bowtie | white_shirt | glasses | monocle | center_frills | frilled_sleeves | hat | holding | white_bikini | frilled_bikini | armpits | necklace | 1boy | hetero | solo_focus | necktie | penis | pussy | spread_legs | sweat | anus | bar_censor | kneehighs | no_bra | one_eye_closed | underwear | detached_sleeves | frills | blue_headwear | closed_mouth | mini_hat | black_sleeves | blue_dress | blue_sleeves | collarbone | parted_lips | serious | shorts | black_dress | string | juliet_sleeves | lolita_fashion | black_bow | black_pantyhose | black_rose | blue_bow | blue_flower | petals | athletic_leotard | two-tone_leotard | white_leotard | white_pantyhose | star_print | gymnastics | split | standing |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------|:---------------|:-------|:-----------|:---------|:--------------------|:--------|:--------|:----------|:-------------|:------------------|:-----------------|:---------------|:--------------|:---------------|:--------|:--------|:-------------|:----------------|:-------------------|:-------------|:-----------------|:----------------|:--------------------|:----------------|:--------------|:--------------------|:----------------|:--------------|:------------|:-----------------|:--------------|:----------------|:---------------|:--------------|:------|:--------------|:---------------|:---------|:-----------|:----------------------|:-------------|:----------------|:---------------|:-----------------|:----------------|:-----------|:--------|:---------------|:------|:---------------|:---------------|:----------------------|:------------------|:-----------|:---------------|:---------------|:----------|:---------|:--------------|:--------|:----------|:--------------------|:------------------|:-------------|:----------------|:---------------|:--------------|:-------------|:--------------|:----------|:----------|:----------------|:------------------|:------|:----------|:---------------|:-----------------|:----------|:-----------|:-------|:---------|:-------------|:----------|:--------|:--------|:--------------|:--------|:-------|:-------------|:------------|:---------|:-----------------|:------------|:-------------------|:---------|:----------------|:---------------|:-----------|:----------------|:-------------|:---------------|:-------------|:--------------|:----------|:---------|:--------------|:---------|:-----------------|:-----------------|:------------|:------------------|:-------------|:-----------|:--------------|:---------|:-------------------|:-------------------|:----------------|:------------------|:-------------|:-------------|:--------|:-----------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | | | X | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | | X | | | X | | X | | X | | | | | | | X | X | X | X | | | | X | | | | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | | X | | | X | | X | | X | | | | | | | | | | X | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | | | X | | | X | | X | | | | | | | | | X | | | | | X | | | | | | | | X | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | | | X | | | X | | X | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 7 |  |  |  |  |  | X | | | X | | | X | | X | | | | | | | | | | | | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 7 |  |  |  |  |  | X | | | | | | | | X | | | | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | | | X | | | X | | X | | X | | | | | | | | | | X | | | | X | X | | | | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 10 | 5 |  |  |  |  |  | X | | | X | | | X | | | | | | | | | | | | | | X | | | | X | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 11 | 6 |  |  |  |  |  | X | | | X | | | X | | | | | | | | | | | | | | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X |
|
distilled-from-one-sec-cv12/chunk_69 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1047333428
num_examples: 204079
download_size: 1063265404
dataset_size: 1047333428
---
# Dataset Card for "chunk_69"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Heigke/stanford-enigma-philosophy-chat |
---
license: cc
---
- **Curated by:** [Heigke](https://github.com/Heigke)
- **Funded by:** [r3tex](https://github.com/r3tex)
- **Shared by:** [Project Nephilim](https://github.com/r3tex/nephilim)
- **Language(s) (NLP):** English
- **License:** CC
# Dataset Card for stanford-enigma-philosophy-chat dataset
Roughly 27k questions and answers inspired by articles from [Stanford Encyclopedia of Philosophy](https://plato.stanford.edu/).
The questions range all the way from Zombies to the concept of Abduction, from Metaphysics to Neuroethics, and thus cover some of the essence of mathematics, logic, and philosophy.
## Dataset Details
The dataset is simple and is ready for use by, e.g., [Mixtral 8x7B](https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1) in [LLama-Factory](https://github.com/hiyouga/LLaMA-Factory) with its three entries: input, instruction and output.
### Dataset Description/How to use
#### Format
The dataset is a json file with the following format:
```
[
{
"instruction": "You are an expert in philosophy.",
"input": "How would you describe William Whewell's contribution to philosophy and science in the 19th century Britain?",
"output": "William Whewell was a polymath who wrote extensively on various subjects including mechanics, mineralogy, geology, astronomy, political economy, theology, educational reform, internati>
},
{
"instruction": "You are an expert in philosophy.",
"input": "What were the key influences and associations in William Whewell's life?",
"output": "William Whewell was greatly influenced by his association with three of his fellow students at Cambridge: Charles Babbage, John Herschel, and Richard Jones."
}
]
```
#### How to use with the `datasets` library
```
from datasets import load_dataset
dataset = load_dataset("Heigke/stanford-enigma-philosophy-chat")
```
#### How to use with LLama-Factory
Alter the `dataset_info.json` at `LLaMa-Factory/data` with an extra entry like the one below:
```
{
"stanford-enigma-philosophy-chat": {
"hf_hub_url": "Heigke/stanford-enigma-philosophy-chat"
},
"philosophy": {
"file_name": "cleaned_philosophy_dataset.json",
"file_sha1": "3a771f4d524d513be37d8d31166274d3a18a610d"
},
"alpaca_en": {
"file_name": "alpaca_data_en_52k.json",
...
```
Then use the flag `--dataset stanford-enigma-philosophy-chat`.
For example, like this if you want to QLoRA-train Mixtral with flash attention:
```
CUDA_VISIBLE_DEVICES=2 python3 src/train_bash.py --stage sft --do_train --model_name_or_path mistralai/Mixtral-8x7B-Instruct-v0.1 --dataset stanford-enigma-philosophy-chat --template mistral --finetuning_type lora --lora_target q_proj,v_proj --output_dir path_to_sft_checkpoint_hf --overwrite_cache --per_device_train_batch_size 4 --gradient_accumulation_steps 4 --lr_scheduler_type cosine --logging_steps 10 --save_steps 1000 --learning_rate 5e-5 --num_train_epochs 3.0 --plot_loss --flash_attn --quantization_bit 4 --cache_dir /mnt/hdd1
```
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** -
- **Paper [optional]:** Coming
- **Demo [optional]:** Coming
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Ubaidbhat/QAGeniusPresentation | ---
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: source_doc
dtype: string
- name: groundedness_score
dtype: int64
- name: groundedness_eval
dtype: string
- name: relevance_score
dtype: int64
- name: relevance_eval
dtype: string
splits:
- name: train
num_bytes: 2747
num_examples: 1
download_size: 22364
dataset_size: 2747
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bri25yu/flores200_incomplete | ---
dataset_info:
features:
- name: id
dtype: int32
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 15697219153
num_examples: 20480000
- name: val
num_bytes: 3827042
num_examples: 5000
- name: test
num_bytes: 7670994
num_examples: 10000
download_size: 7817630008
dataset_size: 15708717189
---
# Dataset Card for "flores200_incomplete"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aierwiki/poker_face | ---
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: width
dtype: int32
- name: height
dtype: int32
- name: objects
sequence:
- name: id
dtype: int64
- name: area
dtype: int64
- name: bbox
sequence: float32
length: 4
- name: category
dtype:
class_label:
names:
0: d_A
1: d_2
2: d_3
3: d_4
4: d_5
5: d_6
6: d_7
7: d_8
8: d_9
9: d_10
10: d_J
11: d_Q
12: d_K
13: c_A
14: c_2
15: c_3
16: c_4
17: c_5
18: c_6
19: c_7
20: c_8
21: c_9
22: c_10
23: c_J
24: c_Q
25: c_K
26: h_A
27: h_2
28: h_3
29: h_4
30: h_5
31: h_6
32: h_7
33: h_8
34: h_9
35: h_10
36: h_J
37: h_Q
38: h_K
39: s_A
40: s_2
41: s_3
42: s_4
43: s_5
44: s_6
45: s_7
46: s_8
47: s_9
48: s_10
49: s_J
50: s_Q
51: s_K
splits:
- name: train
num_bytes: 1552058361.0
num_examples: 4500
- name: validation
num_bytes: 470429028.56
num_examples: 1140
download_size: 2127767402
dataset_size: 2022487389.56
---
# Dataset Card for "poker_face"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/takamatsu_tomori_bangdreamitsmygo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Takamatsu Tomori
This is the dataset of Takamatsu Tomori, containing 200 images and their tags.
Images are crawled from many sites (e.g., danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 427 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 427 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 427 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 427 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
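As with other CyberHarem datasets, an archive can be fetched and extracted with `huggingface_hub` (a sketch using the raw package named in the table above):
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the raw archive listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/takamatsu_tomori_bangdreamitsmygo',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to a local directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
```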
|
thercyl/PG | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: float64
- name: Ticker
dtype: string
- name: Year
dtype: string
- name: Text
dtype: string
- name: Embedding
dtype: string
splits:
- name: train
num_bytes: 103619495
num_examples: 2979
download_size: 55479679
dataset_size: 103619495
---
# Dataset Card for "PG"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_CoruNethron__neu-sai-it1 | ---
pretty_name: Evaluation run of CoruNethron/neu-sai-it1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CoruNethron/neu-sai-it1](https://huggingface.co/CoruNethron/neu-sai-it1) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CoruNethron__neu-sai-it1_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-21T19:30:24.351070](https://huggingface.co/datasets/open-llm-leaderboard/details_CoruNethron__neu-sai-it1_public/blob/main/results_2023-11-21T19-30-24.351070.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5949297319149666,\n\
\ \"acc_stderr\": 0.03274268078653866,\n \"acc_norm\": 0.6054937730425815,\n\
\ \"acc_norm_stderr\": 0.03355540671285046,\n \"mc1\": 0.3598531211750306,\n\
\ \"mc1_stderr\": 0.016801860466677154,\n \"mc2\": 0.5148628224777658,\n\
\ \"mc2_stderr\": 0.015540287053669583,\n \"em\": 0.3584312080536913,\n\
\ \"em_stderr\": 0.004910934869746984,\n \"f1\": 0.4530736157718142,\n\
\ \"f1_stderr\": 0.004671764766418761\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5708191126279863,\n \"acc_stderr\": 0.014464085894870653,\n\
\ \"acc_norm\": 0.6126279863481229,\n \"acc_norm_stderr\": 0.01423587248790987\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6184027086237801,\n\
\ \"acc_stderr\": 0.00484785754695748,\n \"acc_norm\": 0.8138816968731328,\n\
\ \"acc_norm_stderr\": 0.0038840668811314745\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849724,\n\
\ \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849724\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6452830188679245,\n \"acc_stderr\": 0.02944517532819959,\n\
\ \"acc_norm\": 0.6452830188679245,\n \"acc_norm_stderr\": 0.02944517532819959\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n\
\ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n\
\ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929775,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929775\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.502127659574468,\n \"acc_stderr\": 0.03268572658667492,\n\
\ \"acc_norm\": 0.502127659574468,\n \"acc_norm_stderr\": 0.03268572658667492\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520196,\n \"\
acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520196\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7064516129032258,\n\
\ \"acc_stderr\": 0.025906087021319295,\n \"acc_norm\": 0.7064516129032258,\n\
\ \"acc_norm_stderr\": 0.025906087021319295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7424242424242424,\n \"acc_stderr\": 0.03115626951964683,\n \"\
acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.03115626951964683\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397443,\n\
\ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5923076923076923,\n \"acc_stderr\": 0.024915243985987847,\n\
\ \"acc_norm\": 0.5923076923076923,\n \"acc_norm_stderr\": 0.024915243985987847\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228405,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228405\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.031204691225150016,\n\
\ \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.031204691225150016\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8073394495412844,\n \"acc_stderr\": 0.016909276884936066,\n \"\
acc_norm\": 0.8073394495412844,\n \"acc_norm_stderr\": 0.016909276884936066\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39351851851851855,\n \"acc_stderr\": 0.03331747876370312,\n \"\
acc_norm\": 0.39351851851851855,\n \"acc_norm_stderr\": 0.03331747876370312\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.02732547096671632,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.02732547096671632\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229962,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229962\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516304,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516304\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.041331194402438376,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.041331194402438376\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8020434227330779,\n\
\ \"acc_stderr\": 0.01424887354921756,\n \"acc_norm\": 0.8020434227330779,\n\
\ \"acc_norm_stderr\": 0.01424887354921756\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.02607431485165708,\n\
\ \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.02607431485165708\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31620111731843575,\n\
\ \"acc_stderr\": 0.015551673652172554,\n \"acc_norm\": 0.31620111731843575,\n\
\ \"acc_norm_stderr\": 0.015551673652172554\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.026716118380156847,\n\
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.026716118380156847\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n\
\ \"acc_stderr\": 0.026385273703464496,\n \"acc_norm\": 0.684887459807074,\n\
\ \"acc_norm_stderr\": 0.026385273703464496\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6635802469135802,\n \"acc_stderr\": 0.02628973494595293,\n\
\ \"acc_norm\": 0.6635802469135802,\n \"acc_norm_stderr\": 0.02628973494595293\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370593,\n \
\ \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370593\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4380704041720991,\n\
\ \"acc_stderr\": 0.01267190278256765,\n \"acc_norm\": 0.4380704041720991,\n\
\ \"acc_norm_stderr\": 0.01267190278256765\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5735294117647058,\n \"acc_stderr\": 0.030042615832714864,\n\
\ \"acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.030042615832714864\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6290849673202614,\n \"acc_stderr\": 0.019542101564854125,\n \
\ \"acc_norm\": 0.6290849673202614,\n \"acc_norm_stderr\": 0.019542101564854125\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6693877551020408,\n \"acc_stderr\": 0.030116426296540603,\n\
\ \"acc_norm\": 0.6693877551020408,\n \"acc_norm_stderr\": 0.030116426296540603\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.02519692987482706,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.02519692987482706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727668,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727668\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3598531211750306,\n\
\ \"mc1_stderr\": 0.016801860466677154,\n \"mc2\": 0.5148628224777658,\n\
\ \"mc2_stderr\": 0.015540287053669583\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7750591949486977,\n \"acc_stderr\": 0.011735043564126735\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.3584312080536913,\n \
\ \"em_stderr\": 0.004910934869746984,\n \"f1\": 0.4530736157718142,\n \
\ \"f1_stderr\": 0.004671764766418761\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.02880970432145565,\n \"acc_stderr\": 0.004607484283767454\n\
\ }\n}\n```"
repo_url: https://huggingface.co/CoruNethron/neu-sai-it1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|arc:challenge|25_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|drop|3_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|gsm8k|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hellaswag|10_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-21T19-30-24.351070.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-21T19-30-24.351070.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- '**/details_harness|winogrande|5_2023-11-21T19-30-24.351070.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-21T19-30-24.351070.parquet'
- config_name: results
data_files:
- split: 2023_11_21T19_30_24.351070
path:
- results_2023-11-21T19-30-24.351070.parquet
- split: latest
path:
- results_2023-11-21T19-30-24.351070.parquet
---
# Dataset Card for Evaluation run of CoruNethron/neu-sai-it1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CoruNethron/neu-sai-it1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CoruNethron/neu-sai-it1](https://huggingface.co/CoruNethron/neu-sai-it1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CoruNethron__neu-sai-it1_public",
"harness_winogrande_5",
split="train")
```
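The same pattern extends to the other configurations. A minimal sketch of exploring them, assuming the repository is public and relying only on the standard `datasets` API:
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_CoruNethron__neu-sai-it1_public"

# One configuration per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names(repo)

# The "latest" split of the "results" configuration points to the metrics
# of the most recent evaluation run.
results = load_dataset(repo, "results", split="latest")
```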
## Latest results
These are the [latest results from run 2023-11-21T19:30:24.351070](https://huggingface.co/datasets/open-llm-leaderboard/details_CoruNethron__neu-sai-it1_public/blob/main/results_2023-11-21T19-30-24.351070.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5949297319149666,
"acc_stderr": 0.03274268078653866,
"acc_norm": 0.6054937730425815,
"acc_norm_stderr": 0.03355540671285046,
"mc1": 0.3598531211750306,
"mc1_stderr": 0.016801860466677154,
"mc2": 0.5148628224777658,
"mc2_stderr": 0.015540287053669583,
"em": 0.3584312080536913,
"em_stderr": 0.004910934869746984,
"f1": 0.4530736157718142,
"f1_stderr": 0.004671764766418761
},
"harness|arc:challenge|25": {
"acc": 0.5708191126279863,
"acc_stderr": 0.014464085894870653,
"acc_norm": 0.6126279863481229,
"acc_norm_stderr": 0.01423587248790987
},
"harness|hellaswag|10": {
"acc": 0.6184027086237801,
"acc_stderr": 0.00484785754695748,
"acc_norm": 0.8138816968731328,
"acc_norm_stderr": 0.0038840668811314745
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.03910525752849724,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.03910525752849724
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6452830188679245,
"acc_stderr": 0.02944517532819959,
"acc_norm": 0.6452830188679245,
"acc_norm_stderr": 0.02944517532819959
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929775,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929775
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.502127659574468,
"acc_stderr": 0.03268572658667492,
"acc_norm": 0.502127659574468,
"acc_norm_stderr": 0.03268572658667492
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520196,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520196
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7064516129032258,
"acc_stderr": 0.025906087021319295,
"acc_norm": 0.7064516129032258,
"acc_norm_stderr": 0.025906087021319295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.03115626951964683,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.03115626951964683
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.026499057701397443,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.026499057701397443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5923076923076923,
"acc_stderr": 0.024915243985987847,
"acc_norm": 0.5923076923076923,
"acc_norm_stderr": 0.024915243985987847
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228405,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228405
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.031204691225150016,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.031204691225150016
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8073394495412844,
"acc_stderr": 0.016909276884936066,
"acc_norm": 0.8073394495412844,
"acc_norm_stderr": 0.016909276884936066
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39351851851851855,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.39351851851851855,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.02732547096671632,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.02732547096671632
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229962,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229962
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516304,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516304
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.041331194402438376,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.041331194402438376
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459753,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459753
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8020434227330779,
"acc_stderr": 0.01424887354921756,
"acc_norm": 0.8020434227330779,
"acc_norm_stderr": 0.01424887354921756
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.02607431485165708,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.02607431485165708
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31620111731843575,
"acc_stderr": 0.015551673652172554,
"acc_norm": 0.31620111731843575,
"acc_norm_stderr": 0.015551673652172554
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.026716118380156847,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.026716118380156847
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464496,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464496
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6635802469135802,
"acc_stderr": 0.02628973494595293,
"acc_norm": 0.6635802469135802,
"acc_norm_stderr": 0.02628973494595293
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370593,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370593
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4380704041720991,
"acc_stderr": 0.01267190278256765,
"acc_norm": 0.4380704041720991,
"acc_norm_stderr": 0.01267190278256765
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5735294117647058,
"acc_stderr": 0.030042615832714864,
"acc_norm": 0.5735294117647058,
"acc_norm_stderr": 0.030042615832714864
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6290849673202614,
"acc_stderr": 0.019542101564854125,
"acc_norm": 0.6290849673202614,
"acc_norm_stderr": 0.019542101564854125
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6693877551020408,
"acc_stderr": 0.030116426296540603,
"acc_norm": 0.6693877551020408,
"acc_norm_stderr": 0.030116426296540603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482706,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727668,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727668
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3598531211750306,
"mc1_stderr": 0.016801860466677154,
"mc2": 0.5148628224777658,
"mc2_stderr": 0.015540287053669583
},
"harness|winogrande|5": {
"acc": 0.7750591949486977,
"acc_stderr": 0.011735043564126735
},
"harness|drop|3": {
"em": 0.3584312080536913,
"em_stderr": 0.004910934869746984,
"f1": 0.4530736157718142,
"f1_stderr": 0.004671764766418761
},
"harness|gsm8k|5": {
"acc": 0.02880970432145565,
"acc_stderr": 0.004607484283767454
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
GoshaLetov/calc-qa-augment-sft-raw | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: question_bt_wmt19_1
dtype: string
- name: answer_bt_wmt19_1
dtype: string
- name: question_bt_wmt19_2
dtype: string
- name: answer_bt_wmt19_2
dtype: string
- name: question_bt_wmt19_3
dtype: string
- name: answer_bt_wmt19_3
dtype: string
- name: question_bt_wmt19_5
dtype: string
- name: answer_bt_wmt19_5
dtype: string
- name: question_bt_opus_1
dtype: string
- name: answer_bt_opus_1
dtype: string
- name: question_bt_opus_2
dtype: string
- name: answer_bt_opus_2
dtype: string
- name: question_bt_opus_3
dtype: string
- name: answer_bt_opus_3
dtype: string
- name: question_bt_opus_5
dtype: string
- name: answer_bt_opus_5
dtype: string
- name: question_pt_mt5small
dtype: string
- name: answer_pt_mt5small
dtype: string
- name: question_pt_mt5base
dtype: string
- name: answer_pt_mt5base
dtype: string
- name: question_pt_rut5
dtype: string
- name: answer_pt_rut5
dtype: string
splits:
- name: train
num_bytes: 463950
num_examples: 69
download_size: 253794
dataset_size: 463950
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "calc-qa-augment-sft-2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
knlp/pegasus-ft-slic | ---
license: apache-2.0
dataset_info:
features:
- name: passage
dtype: string
- name: summary
dtype: string
- name: candidates
sequence: string
splits:
- name: train
num_bytes: 13330044
num_examples: 5400
download_size: 7205274
dataset_size: 13330044
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
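A minimal usage sketch based on the schema above. Treating `candidates` as model-generated candidate summaries for SLiC-style calibration is an assumption suggested by the repository name, not stated by the card:
```python
from datasets import load_dataset

# 5,400 (passage, summary, candidates) triples.
ds = load_dataset("knlp/pegasus-ft-slic", split="train")

row = ds[0]
print(row["summary"])          # reference summary for the passage
print(len(row["candidates"]))  # number of candidate summaries (a list of strings)
```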
|
yaoqi/test | ---
license: apache-2.0
---
|
AisotTechnologies/aisot_btc_lob_trades | ---
license: cc-by-nc-sa-4.0
tags:
- finance
- time-series
---
This dataset consists of snapshots of limit order books and trades for BTC/USD (i.e., the Bitcoin/US dollar currency pair) from May 31, 2018, 9:55 pm (UTC) through September 30, 2018, 9:59 pm (UTC) from the Bitstamp exchange (https://www.bitstamp.net).
The data has been collected by Aisot Technologies AG, Zürich (www.aisot.com). Trade data is at millisecond frequency. Limit order book snapshots are at minute frequency, with aggregated amounts at each price level and a depth of up to 5000 levels on each bid/ask side. For more information about the dataset, please refer to the citation below.
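As an illustration of how such data is typically consumed, the sketch below derives a minute-level mid-price series from the snapshots and aggregates trade volume to the same frequency. The file names and column names (`bid_price_1`/`ask_price_1` for the best quotes, `amount` for trade size, `timestamp`) are hypothetical placeholders, not the actual schema:
```python
import pandas as pd

# Hypothetical layout: one row per minute-level LOB snapshot, best bid/ask in
# bid_price_1 / ask_price_1 (column names assumed, not taken from the data).
lob = pd.read_csv("btc_usd_lob_snapshots.csv", parse_dates=["timestamp"])
lob["mid_price"] = (lob["bid_price_1"] + lob["ask_price_1"]) / 2

# Millisecond-frequency trades resampled to per-minute traded volume.
trades = pd.read_csv("btc_usd_trades.csv", parse_dates=["timestamp"])
minute_volume = trades.set_index("timestamp")["amount"].resample("1min").sum()
```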
The data is provided “as is” without any warranties. A short approval process is required before accessing the data.
By accessing the dataset, you agree not to disseminate it elsewhere and to adhere to the cc-by-nc-sa-4.0 license agreement.
Note that we approve only requests that include a full name (first and last name) and an email address.
How to cite the dataset: Antulov-Fantulin, N., Guo, T. & Lillo, F. (2021). “Temporal mixture ensemble models for probabilistic forecasting of intraday cryptocurrency volume.” In: Decisions Econ. Finan. 44, pp. 905–940. https://doi.org/10.1007/s10203-021-00344-9 |
zurlog/subset_wlabels | ---
license: cc-by-4.0
---
|
anyspeech/mswc_test | ---
configs:
- config_name: default
data_files:
- split: query
path: data/query-*
- split: candidate
path: data/candidate-*
dataset_info:
features:
- name: key
dtype: string
- name: phones
dtype: string
- name: audio
struct:
- name: array
sequence: float64
- name: sampling_rate
dtype: int64
splits:
- name: query
num_bytes: 213251381
num_examples: 1665
- name: candidate
num_bytes: 213251405
num_examples: 1665
download_size: 40945132
dataset_size: 426502786
---
# Dataset Card for "mswc_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_postbot__gpt-neo-1.3B-emailgen | ---
pretty_name: Evaluation run of postbot/gpt-neo-1.3B-emailgen
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [postbot/gpt-neo-1.3B-emailgen](https://huggingface.co/postbot/gpt-neo-1.3B-emailgen)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_postbot__gpt-neo-1.3B-emailgen\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-10T19:11:14.662804](https://huggingface.co/datasets/open-llm-leaderboard/details_postbot__gpt-neo-1.3B-emailgen/blob/main/results_2024-01-10T19-11-14.662804.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24490027036588977,\n\
\ \"acc_stderr\": 0.030358881954874864,\n \"acc_norm\": 0.24614205399486563,\n\
\ \"acc_norm_stderr\": 0.031165759888036278,\n \"mc1\": 0.2521419828641371,\n\
\ \"mc1_stderr\": 0.01520152224629997,\n \"mc2\": 0.4254807884462743,\n\
\ \"mc2_stderr\": 0.014689896884097952\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2525597269624573,\n \"acc_stderr\": 0.012696728980207708,\n\
\ \"acc_norm\": 0.29948805460750855,\n \"acc_norm_stderr\": 0.013385021637313569\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.38020314678350925,\n\
\ \"acc_stderr\": 0.004844445265582649,\n \"acc_norm\": 0.4794861581358295,\n\
\ \"acc_norm_stderr\": 0.004985580065946457\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2,\n \
\ \"acc_stderr\": 0.034554737023254366,\n \"acc_norm\": 0.2,\n \"\
acc_norm_stderr\": 0.034554737023254366\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n\
\ \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2792452830188679,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.2792452830188679,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653695,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653695\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816508,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816508\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n\
\ \"acc_stderr\": 0.03186209851641144,\n \"acc_norm\": 0.2254335260115607,\n\
\ \"acc_norm_stderr\": 0.03186209851641144\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2851063829787234,\n \"acc_stderr\": 0.029513196625539355,\n\
\ \"acc_norm\": 0.2851063829787234,\n \"acc_norm_stderr\": 0.029513196625539355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.30344827586206896,\n \"acc_stderr\": 0.038312260488503336,\n\
\ \"acc_norm\": 0.30344827586206896,\n \"acc_norm_stderr\": 0.038312260488503336\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24338624338624337,\n \"acc_stderr\": 0.022101128787415433,\n \"\
acc_norm\": 0.24338624338624337,\n \"acc_norm_stderr\": 0.022101128787415433\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03670066451047181,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03670066451047181\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.18064516129032257,\n \"acc_stderr\": 0.021886178567172534,\n \"\
acc_norm\": 0.18064516129032257,\n \"acc_norm_stderr\": 0.021886178567172534\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.22660098522167488,\n \"acc_stderr\": 0.029454863835292975,\n \"\
acc_norm\": 0.22660098522167488,\n \"acc_norm_stderr\": 0.029454863835292975\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.20202020202020202,\n \"acc_stderr\": 0.028606204289229876,\n \"\
acc_norm\": 0.20202020202020202,\n \"acc_norm_stderr\": 0.028606204289229876\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22279792746113988,\n \"acc_stderr\": 0.03003114797764154,\n\
\ \"acc_norm\": 0.22279792746113988,\n \"acc_norm_stderr\": 0.03003114797764154\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2076923076923077,\n \"acc_stderr\": 0.020567539567246787,\n\
\ \"acc_norm\": 0.2076923076923077,\n \"acc_norm_stderr\": 0.020567539567246787\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23949579831932774,\n \"acc_stderr\": 0.02772206549336126,\n\
\ \"acc_norm\": 0.23949579831932774,\n \"acc_norm_stderr\": 0.02772206549336126\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436777,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436777\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22385321100917432,\n \"acc_stderr\": 0.017871217767790222,\n \"\
acc_norm\": 0.22385321100917432,\n \"acc_norm_stderr\": 0.017871217767790222\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1574074074074074,\n \"acc_stderr\": 0.02483717351824239,\n \"\
acc_norm\": 0.1574074074074074,\n \"acc_norm_stderr\": 0.02483717351824239\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.23529411764705882,\n \"acc_stderr\": 0.029771775228145628,\n \"\
acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.029771775228145628\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2742616033755274,\n \"acc_stderr\": 0.029041333510598018,\n \
\ \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.029041333510598018\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3452914798206278,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.3452914798206278,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2644628099173554,\n \"acc_stderr\": 0.04026187527591207,\n \"\
acc_norm\": 0.2644628099173554,\n \"acc_norm_stderr\": 0.04026187527591207\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.23148148148148148,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2085889570552147,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.2085889570552147,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.26495726495726496,\n\
\ \"acc_stderr\": 0.028911208802749482,\n \"acc_norm\": 0.26495726495726496,\n\
\ \"acc_norm_stderr\": 0.028911208802749482\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24393358876117496,\n\
\ \"acc_stderr\": 0.015357212665829484,\n \"acc_norm\": 0.24393358876117496,\n\
\ \"acc_norm_stderr\": 0.015357212665829484\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.25722543352601157,\n \"acc_stderr\": 0.023532925431044276,\n\
\ \"acc_norm\": 0.25722543352601157,\n \"acc_norm_stderr\": 0.023532925431044276\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.025261691219729505,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.025261691219729505\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2057877813504823,\n\
\ \"acc_stderr\": 0.022961339906764237,\n \"acc_norm\": 0.2057877813504823,\n\
\ \"acc_norm_stderr\": 0.022961339906764237\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.024659685185967284,\n\
\ \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.024659685185967284\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24822695035460993,\n \"acc_stderr\": 0.025770015644290392,\n \
\ \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.025770015644290392\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.21642764015645372,\n\
\ \"acc_stderr\": 0.010517798313579914,\n \"acc_norm\": 0.21642764015645372,\n\
\ \"acc_norm_stderr\": 0.010517798313579914\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.02439819298665492,\n\
\ \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.02439819298665492\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24019607843137256,\n \"acc_stderr\": 0.017282760695167425,\n \
\ \"acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.017282760695167425\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n\
\ \"acc_stderr\": 0.04013964554072775,\n \"acc_norm\": 0.22727272727272727,\n\
\ \"acc_norm_stderr\": 0.04013964554072775\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.17959183673469387,\n \"acc_stderr\": 0.024573293589585637,\n\
\ \"acc_norm\": 0.17959183673469387,\n \"acc_norm_stderr\": 0.024573293589585637\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.21393034825870647,\n\
\ \"acc_stderr\": 0.028996909693328927,\n \"acc_norm\": 0.21393034825870647,\n\
\ \"acc_norm_stderr\": 0.028996909693328927\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.27710843373493976,\n\
\ \"acc_stderr\": 0.03484331592680588,\n \"acc_norm\": 0.27710843373493976,\n\
\ \"acc_norm_stderr\": 0.03484331592680588\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.03565079670708311,\n\
\ \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.03565079670708311\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2521419828641371,\n\
\ \"mc1_stderr\": 0.01520152224629997,\n \"mc2\": 0.4254807884462743,\n\
\ \"mc2_stderr\": 0.014689896884097952\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5627466456195738,\n \"acc_stderr\": 0.01394139331069592\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/postbot/gpt-neo-1.3B-emailgen
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|arc:challenge|25_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|gsm8k|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hellaswag|10_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T19-11-14.662804.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T19-11-14.662804.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- '**/details_harness|winogrande|5_2024-01-10T19-11-14.662804.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-10T19-11-14.662804.parquet'
- config_name: results
data_files:
- split: 2024_01_10T19_11_14.662804
path:
- results_2024-01-10T19-11-14.662804.parquet
- split: latest
path:
- results_2024-01-10T19-11-14.662804.parquet
---
# Dataset Card for Evaluation run of postbot/gpt-neo-1.3B-emailgen
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [postbot/gpt-neo-1.3B-emailgen](https://huggingface.co/postbot/gpt-neo-1.3B-emailgen) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_postbot__gpt-neo-1.3B-emailgen",
"harness_winogrande_5",
split="train")
```
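The aggregated numbers shown below can likewise be loaded from the "results" configuration; a minimal sketch:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated scores of each run;
# the "latest" split always points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_postbot__gpt-neo-1.3B-emailgen",
    "results",
    split="latest",
)
print(results[0])
```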
## Latest results
These are the [latest results from run 2024-01-10T19:11:14.662804](https://huggingface.co/datasets/open-llm-leaderboard/details_postbot__gpt-neo-1.3B-emailgen/blob/main/results_2024-01-10T19-11-14.662804.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24490027036588977,
"acc_stderr": 0.030358881954874864,
"acc_norm": 0.24614205399486563,
"acc_norm_stderr": 0.031165759888036278,
"mc1": 0.2521419828641371,
"mc1_stderr": 0.01520152224629997,
"mc2": 0.4254807884462743,
"mc2_stderr": 0.014689896884097952
},
"harness|arc:challenge|25": {
"acc": 0.2525597269624573,
"acc_stderr": 0.012696728980207708,
"acc_norm": 0.29948805460750855,
"acc_norm_stderr": 0.013385021637313569
},
"harness|hellaswag|10": {
"acc": 0.38020314678350925,
"acc_stderr": 0.004844445265582649,
"acc_norm": 0.4794861581358295,
"acc_norm_stderr": 0.004985580065946457
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2,
"acc_stderr": 0.034554737023254366,
"acc_norm": 0.2,
"acc_norm_stderr": 0.034554737023254366
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2792452830188679,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.2792452830188679,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.035868792800803406,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.035868792800803406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.03186209851641144,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.03186209851641144
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171453,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171453
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2851063829787234,
"acc_stderr": 0.029513196625539355,
"acc_norm": 0.2851063829787234,
"acc_norm_stderr": 0.029513196625539355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.30344827586206896,
"acc_stderr": 0.038312260488503336,
"acc_norm": 0.30344827586206896,
"acc_norm_stderr": 0.038312260488503336
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24338624338624337,
"acc_stderr": 0.022101128787415433,
"acc_norm": 0.24338624338624337,
"acc_norm_stderr": 0.022101128787415433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047181,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047181
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.18064516129032257,
"acc_stderr": 0.021886178567172534,
"acc_norm": 0.18064516129032257,
"acc_norm_stderr": 0.021886178567172534
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.22660098522167488,
"acc_stderr": 0.029454863835292975,
"acc_norm": 0.22660098522167488,
"acc_norm_stderr": 0.029454863835292975
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.20202020202020202,
"acc_stderr": 0.028606204289229876,
"acc_norm": 0.20202020202020202,
"acc_norm_stderr": 0.028606204289229876
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22279792746113988,
"acc_stderr": 0.03003114797764154,
"acc_norm": 0.22279792746113988,
"acc_norm_stderr": 0.03003114797764154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2076923076923077,
"acc_stderr": 0.020567539567246787,
"acc_norm": 0.2076923076923077,
"acc_norm_stderr": 0.020567539567246787
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23949579831932774,
"acc_stderr": 0.02772206549336126,
"acc_norm": 0.23949579831932774,
"acc_norm_stderr": 0.02772206549336126
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436777,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436777
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22385321100917432,
"acc_stderr": 0.017871217767790222,
"acc_norm": 0.22385321100917432,
"acc_norm_stderr": 0.017871217767790222
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1574074074074074,
"acc_stderr": 0.02483717351824239,
"acc_norm": 0.1574074074074074,
"acc_norm_stderr": 0.02483717351824239
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.029041333510598018,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.029041333510598018
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3452914798206278,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.3452914798206278,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2824427480916031,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.2824427480916031,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2644628099173554,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.2644628099173554,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2085889570552147,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.2085889570552147,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.26495726495726496,
"acc_stderr": 0.028911208802749482,
"acc_norm": 0.26495726495726496,
"acc_norm_stderr": 0.028911208802749482
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24393358876117496,
"acc_stderr": 0.015357212665829484,
"acc_norm": 0.24393358876117496,
"acc_norm_stderr": 0.015357212665829484
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.25722543352601157,
"acc_stderr": 0.023532925431044276,
"acc_norm": 0.25722543352601157,
"acc_norm_stderr": 0.023532925431044276
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.025261691219729505,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.025261691219729505
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2057877813504823,
"acc_stderr": 0.022961339906764237,
"acc_norm": 0.2057877813504823,
"acc_norm_stderr": 0.022961339906764237
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.024659685185967284,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.024659685185967284
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.025770015644290392,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.025770015644290392
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.21642764015645372,
"acc_stderr": 0.010517798313579914,
"acc_norm": 0.21642764015645372,
"acc_norm_stderr": 0.010517798313579914
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20220588235294118,
"acc_stderr": 0.02439819298665492,
"acc_norm": 0.20220588235294118,
"acc_norm_stderr": 0.02439819298665492
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.017282760695167425,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.017282760695167425
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072775,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072775
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17959183673469387,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.17959183673469387,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.21393034825870647,
"acc_stderr": 0.028996909693328927,
"acc_norm": 0.21393034825870647,
"acc_norm_stderr": 0.028996909693328927
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.27710843373493976,
"acc_stderr": 0.03484331592680588,
"acc_norm": 0.27710843373493976,
"acc_norm_stderr": 0.03484331592680588
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.03565079670708311,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.03565079670708311
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2521419828641371,
"mc1_stderr": 0.01520152224629997,
"mc2": 0.4254807884462743,
"mc2_stderr": 0.014689896884097952
},
"harness|winogrande|5": {
"acc": 0.5627466456195738,
"acc_stderr": 0.01394139331069592
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
pnadel/jfk_senior_thesis_data | ---
dataset_info:
features:
- name: index
dtype: int64
- name: collection
dtype: string
- name: packageId
dtype: string
- name: granuleId
dtype: string
- name: title
dtype: string
- name: detailsLink
dtype: string
- name: pdfLink
dtype: string
- name: htmlLink
dtype: string
- name: xmlLink
dtype: string
- name: otherLink1
dtype: string
- name: otherLink2
dtype: float64
- name: teaser
dtype: string
- name: historical
dtype: float64
- name: publishdate
dtype: string
- name: president
dtype: string
- name: full_text
dtype: string
- name: url_to_use
dtype: string
- name: path_to_text
dtype: string
splits:
- name: train
num_bytes: 3121664312
num_examples: 4908
download_size: 1609034276
dataset_size: 3121664312
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "jfk_senior_thesis_data"
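A minimal loading sketch (assuming public access; the field names follow the features declared above):
```python
# Sketch: load the corpus and inspect one record's metadata and text.
from datasets import load_dataset

ds = load_dataset("pnadel/jfk_senior_thesis_data", split="train")
record = ds[0]
print(record["title"], record["publishdate"], record["president"])
print(record["full_text"][:200])  # first 200 characters of the document text
```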
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/1000_People_French_Handwriting_OCR_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
1,000 People - French Handwriting OCR Data. The writers are Europeans who often write French. The images were captured with a scanner at an eye-level angle. The dataset content includes addresses, company names, and personal names. The dataset can be used for tasks such as French handwriting OCR.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1359?source=Huggingface
## Data size
1,000 people; 14 images collected per subject
## Population distribution
Gender distribution: 455 males, 555 females. Age distribution: 10 people under 18 years old, 980 people from 18 to 45 years old, 10 people from 46 to 60 years old.
## Writer
Europeans who often write French
## Collecting environment
Pure color background
## Device
Scanner
## Photographic angle
Eye-level angle
## Data format
The image data format is .png.
## Data content
Includes addresses, company names, and personal names.
## Accuracy rate
The collection content accuracy is not less than 97%.
## Licensing Information
Commercial License
|
AlekseyKorshuk/DotCHA-100k-2D-v2 | ---
dataset_info:
features:
- name: '0'
dtype: string
- name: '1'
dtype: string
- name: letter
sequence: int64
- name: buckets
sequence:
sequence:
sequence: float64
splits:
- name: train
num_bytes: 5305760833
num_examples: 100000
download_size: 3707551387
dataset_size: 5305760833
---
# Dataset Card for "DotCHA-100k-2D-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
c4ba/minhavoz98 | ---
license: openrail
---
|
mucc001/scirepeval_fos_test | ---
license: unknown
dataset_info:
features:
- name: paper_id
dtype: string
- name: label
sequence: int64
splits:
- name: train
num_bytes: 9377971
num_examples: 53133
- name: test
num_bytes: 82664
num_examples: 468
download_size: 703027
dataset_size: 9460635
---
|
TopicNet/RTL-Wiki | ---
language:
- en
multilinguality:
- monolingual
license: other
license_name: topicnet
license_link: >-
https://github.com/machine-intelligence-laboratory/TopicNet/blob/master/LICENSE.txt
configs:
- config_name: "rtl-wiki"
default: true
data_files:
- split: train
path: "data/RTL_Wiki.csv.gz"
- config_name: "rtl-wiki-person"
data_files:
- split: train
path: "data/RTL_Wiki_person.csv.gz"
task_categories:
- text-classification
task_ids:
- topic-classification
- multi-class-classification
- multi-label-classification
tags:
- topic-modeling
- topic-modelling
- text-clustering
- multimodal-data
- multimodal-learning
- modalities
- document-representation
---
# RTL-Wiki
Some measurable characteristics of the dataset:
* D — number of documents
* <modality name> W — modality dictionary size (number of unique tokens)
* <modality name> len D — average document length in modality tokens (number of tokens)
* <modality name> len D uniq — average document length in unique modality tokens (number of unique tokens)
| | D | @lemmatized W | @lemmatized len D | @lemmatized len D uniq | @bigram W | @bigram len D | @bigram len D uniq |
|:------|------------:|-----------------------:|---------------------------:|--------------------------------:|-------------------:|-----------------------:|----------------------------:|
| value | 7838 | 1.28065e+07 | 1633.9 | 691.157 | 503619 | 64.2535 | 30.8372 |
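As a rough sketch (assuming each document is given as a list of modality tokens), these characteristics can be computed like so:
```python
# Sketch: compute D, W, len D, and len D uniq for one modality.
# `docs` is assumed to be a list of token lists for that modality.
def modality_stats(docs):
    vocab = set()
    for doc in docs:
        vocab.update(doc)
    D = len(docs)                                    # number of documents
    W = len(vocab)                                   # modality dictionary size
    len_d = sum(len(d) for d in docs) / D            # average length in tokens
    len_d_uniq = sum(len(set(d)) for d in docs) / D  # average length in unique tokens
    return D, W, len_d, len_d_uniq

print(modality_stats([["topic", "model", "topic"], ["wiki", "article"]]))  # (2, 4, 2.5, 2.0)
```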
Information about document lengths in modality tokens:
| | len_total@lemmatized | len_total@bigram | len_uniq@lemmatized | len_uniq@bigram |
|:-----|-----------------------:|-------------------:|----------------------:|------------------:|
| mean | 1633.9 | 64.2535 | 691.157 | 30.8372 |
| std | 1565.19 | 73.1737 | 521.463 | 28.071 |
| min | 2 | 0 | 2 | 0 |
| 25% | 500 | 18 | 283 | 11 |
| 50% | 1115.5 | 41 | 554 | 22 |
| 75% | 2233.5 | 85 | 961 | 42 |
| max | 15851 | 1098 | 4184 | 283 |
## RTL-Wiki-Person
A version of the dataset filtered to articles about people. It contains only 1201 documents.
Some measurable characteristics of the dataset:
| | D | @lemmatized W | @lemmatized len D | @lemmatized len D uniq | @bigram W | @bigram len D | @bigram len D uniq |
|:------|------------:|-----------------------:|---------------------------:|--------------------------------:|-------------------:|-----------------------:|----------------------------:|
| value | 1201 | 1.92167e+06 | 1600.06 | 729.93 | 371430 | 309.267 | 196.595 |
Information about document lengths in modality tokens:
| | len_total@lemmatized | len_total@bigram | len_uniq@lemmatized | len_uniq@bigram |
|:-----|-----------------------:|-------------------:|----------------------:|------------------:|
| mean | 1600.06 | 309.267 | 729.93 | 196.595 |
| std | 1569.31 | 323.991 | 541.153 | 170.06 |
| min | 73 | 4 | 60 | 4 |
| 25% | 484 | 90 | 305 | 70 |
| 50% | 1036 | 206 | 575 | 147 |
| 75% | 2117 | 403 | 1007 | 268 |
| max | 11661 | 3212 | 3108 | 1216 |
|
Diegulio/PetClassification | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': No detectado
'1': affenpinscher
'2': afghan_hound
'3': african_hunting_dog
'4': airedale
'5': american_staffordshire_terrier
'6': appenzeller
'7': australian_terrier
'8': basenji
'9': basset
'10': beagle
'11': bedlington_terrier
'12': bernese_mountain_dog
'13': black-and-tan_coonhound
'14': blenheim_spaniel
'15': bloodhound
'16': bluetick
'17': border_collie
'18': border_terrier
'19': borzoi
'20': boston_bull
'21': bouvier_des_flandres
'22': boxer
'23': brabancon_griffon
'24': briard
'25': brittany_spaniel
'26': bull_mastiff
'27': cairn
'28': cardigan
'29': chesapeake_bay_retriever
'30': chihuahua
'31': chow
'32': clumber
'33': cocker_spaniel
'34': collie
'35': curly-coated_retriever
'36': dandie_dinmont
'37': dhole
'38': dingo
'39': doberman
'40': english_foxhound
'41': english_setter
'42': english_springer
'43': entlebucher
'44': eskimo_dog
'45': flat-coated_retriever
'46': french_bulldog
'47': gato
'48': german_shepherd
'49': german_short-haired_pointer
'50': giant_schnauzer
'51': golden_retriever
'52': gordon_setter
'53': great_dane
'54': great_pyrenees
'55': greater_swiss_mountain_dog
'56': groenendael
'57': ibizan_hound
'58': irish_setter
'59': irish_terrier
'60': irish_water_spaniel
'61': irish_wolfhound
'62': italian_greyhound
'63': japanese_spaniel
'64': keeshond
'65': kelpie
'66': kerry_blue_terrier
'67': komondor
'68': kuvasz
'69': labrador_retriever
'70': lakeland_terrier
'71': leonberg
'72': lhasa
'73': malamute
'74': malinois
'75': maltese_dog
'76': mexican_hairless
'77': miniature_pinscher
'78': miniature_poodle
'79': miniature_schnauzer
'80': newfoundland
'81': norfolk_terrier
'82': norwegian_elkhound
'83': norwich_terrier
'84': old_english_sheepdog
'85': otterhound
'86': papillon
'87': pekinese
'88': pembroke
'89': pomeranian
'90': pug
'91': redbone
'92': rhodesian_ridgeback
'93': rottweiler
'94': saint_bernard
'95': saluki
'96': samoyed
'97': schipperke
'98': scotch_terrier
'99': scottish_deerhound
'100': sealyham_terrier
'101': shetland_sheepdog
'102': shih-tzu
'103': siberian_husky
'104': silky_terrier
'105': soft-coated_wheaten_terrier
'106': staffordshire_bullterrier
'107': standard_poodle
'108': standard_schnauzer
'109': sussex_spaniel
'110': tibetan_mastiff
'111': tibetan_terrier
'112': toy_poodle
'113': toy_terrier
'114': vizsla
'115': walker_hound
'116': weimaraner
'117': welsh_springer_spaniel
'118': west_highland_white_terrier
'119': whippet
'120': wire-haired_fox_terrier
'121': yorkshire_terrier
splits:
- name: train
num_bytes: 344179685.94
num_examples: 7499
- name: validation
num_bytes: 29205702.0
num_examples: 834
- name: test
num_bytes: 81732756.983
num_examples: 2083
download_size: 379294077
dataset_size: 455118144.923
---
# Dataset Card for "PetClassification"
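A minimal loading sketch (assuming public access; `int2str` maps the integer labels back to the class names declared above):
```python
# Sketch: load the dataset and map a label id back to its class name.
from datasets import load_dataset

ds = load_dataset("Diegulio/PetClassification", split="train")
example = ds[0]
print(ds.features["label"].int2str(example["label"]))  # e.g. "affenpinscher"
example["image"].show()  # decoded as a PIL image; opens an external viewer
```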
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mrachilles/NTU60PointsRefined | ---
license: mit
---
|
nbtpj/td_qfs | ---
dataset_info:
features:
- name: cluster
dtype: string
- name: documents
sequence: string
- name: query_summ
list:
- name: query
dtype: string
- name: summ
dtype: string
splits:
- name: train
num_bytes: 4347725
num_examples: 4
download_size: 559120
dataset_size: 4347725
---
# Dataset Card for "td_qfs"
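A minimal loading sketch (assuming public access; each cluster pairs a document set with several query/summary pairs, following the features declared above):
```python
# Sketch: iterate the query/summary pairs attached to each document cluster.
from datasets import load_dataset

ds = load_dataset("nbtpj/td_qfs", split="train")
for cluster in ds:
    print(cluster["cluster"], "-", len(cluster["documents"]), "documents")
    for pair in cluster["query_summ"]:
        print("  Q:", pair["query"][:80])
        print("  S:", pair["summ"][:80])
```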
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shachardon/ShareLM | ---
license: mit
task_categories:
- text-generation
language:
- en
pretty_name: ShareLM
size_categories:
- 1M<n<10M
configs:
- config_name: default
data_files:
- split: train
path:
- "collective_cognition_formatted.json"
- "hh_rlhf_formatted.json"
- "babi_formatted.json"
- "self_feeding_formatted.json"
---
# Dataset Card for ShareLM💬
<!-- Provide a quick summary of the dataset. -->
ShareLM collects and shares human-model interactions in a unified format, gathered from various LLMs and platforms.
The Goal -> Collecting an ever-growing dataset of conversations, for the benefit of the open-source community 💬🥳
Whether you use models, create data, or spaces there is always a way to help!
# How to Contribute?
Want to contribute your own human-model interaction? This is exactly what the [ShareLM plugin](#what-is-the-sharelm-plugin) is for.
Have human-model data that you want to share with the community? Great! You can contact us <a href="mailto:shareLM.project@gmail.com">here</a>.
If you host a model space, it can also share its conversations (with some thought given to privacy first).
## What is the ShareLM plugin?
The ShareLM plugin is a Chrome extension that makes it easy for you to contribute your own human-model interactions.
The conversations are released here under the most permissive terms allowed by the specific model.
## Unified Contributions
Great human-model interaction datasets that compose the ShareLM dataset:
- **ShareLM Plugin** https://chromewebstore.google.com/detail/sharelm-share-your-chat-c/nldoebkdaiidhceaphmipeclmlcbljmh
- **Collective Cognition** https://huggingface.co/datasets/CollectiveCognition/chats-data-2023-10-16?row=11
- **hh rlhf** https://huggingface.co/datasets/Anthropic/hh-rlhf
- **babi** https://github.com/facebookarchive/bAbI-tasks
- **self-feeding** https://parl.ai/projects/self_feeding/
Please see the links for the appropriate citations and license.
## Loading The Full Data
[Wildchat](https://huggingface.co/datasets/allenai/WildChat) and [LMSYS-Chat-1M](https://huggingface.co/datasets/lmsys/lmsys-chat-1m) are also great resources for human-model conversations.
Both are gated datasets, so in order to download them you will first need to agree to their terms of use.
After doing so, you can use the following code to get the full data:
```python
import datasets
import pandas as pd
user_token = "Insert your HF token here"
ours = datasets.load_dataset("shachardon/ShareLM")["train"]
print(ours[0])
lmsys_dataset = datasets.load_dataset("lmsys/lmsys-chat-1m", token=user_token)
lmsys_dataset_train = lmsys_dataset["train"]
examples = []
# Re-shape each LMSYS conversation into the unified ShareLM record format.
for i in range(lmsys_dataset_train.shape[0]):
data = lmsys_dataset_train[i]
conv = data["conversation"]
user_msgs = []
bot_msgs = []
for reply in conv:
if reply['role'] == 'user':
user_msgs.append(reply['content'])
if reply['role'] == 'assistant':
bot_msgs.append(reply['content'])
example = {"conversation_id": data["conversation_id"], "bot_msgs": bot_msgs, "user_msgs": user_msgs,
"source": "https://huggingface.co/datasets/lmsys/lmsys-chat-1m", "model_name": data["model"],
"user_id": "", "user_metadata": {}, "timestamp": "", "conversation_metadata":
str({"language": data["language"], "redacted": str(data["redacted"])})}
examples.append(example)
lmsys_formatted_dataset = datasets.Dataset.from_pandas(pd.DataFrame(data=examples))
wildchat_dataset = datasets.load_dataset("allenai/WildChat", token=user_token)
wildchat_dataset_train = wildchat_dataset["train"]
examples = []
# Apply the same re-shaping to WildChat.
for i in range(wildchat_dataset_train.shape[0]):
data = wildchat_dataset_train[i]
conv = data["conversation"]
user_msgs = []
bot_msgs = []
for reply in conv:
if reply['role'] == 'user':
user_msgs.append(reply['content'])
if reply['role'] == 'assistant':
bot_msgs.append(reply['content'])
example = {"conversation_id": data["conversation_id"], "bot_msgs": bot_msgs, "user_msgs": user_msgs,
"source": "https://huggingface.co/datasets/allenai/WildChat", "model_name": data["model"],
"user_id": "", "user_metadata": {}, "timestamp": conv["timestamp"], "conversation_metadata":
str({"language": data["language"], "redacted": str(data["redacted"]), "toxic": str(data["toxic"])})}
examples.append(example)
wildchat_formatted_dataset = datasets.Dataset.from_pandas(pd.DataFrame(data=examples))
dataset_all = datasets.concatenate_datasets([ours, lmsys_formatted_dataset, wildchat_formatted_dataset])
```
## Dataset Format
- **conversation_id** a unique id for the conversation
- **bot_msgs** a list of strings, of all the model responses
- **user_msgs** a lists of strings, of all the human user responses
- **source** the origin dataset
- **model_name** the model that is used in the conversation
- **user_id** a unique user-id
- **user_metadata** demographic information about the user (such as age, location, etc.)
- **timestamp** timestamp of the conversation
- **conversation metadata** additional conversation information (such as rating, title of the conversation, etc.)
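A minimal sketch of reading one record in this format (assuming `conversation_metadata` is a stringified Python dict, as produced by the loading code above):
```python
# Sketch: inspect one unified-format record and parse its metadata string.
import ast
import datasets

record = datasets.load_dataset("shachardon/ShareLM")["train"][0]
print(record["conversation_id"], record["model_name"], record["source"])
# The metadata is stored as a stringified dict; parse it back before use.
meta = ast.literal_eval(record["conversation_metadata"]) if record["conversation_metadata"] else {}
print(meta.get("language"))
```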
|
rizerphe/glaive-function-calling-v2-llama | ---
license: cc-by-sa-4.0
task_categories:
- text-generation
language:
- en
size_categories:
- 100K<n<1M
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 228119663
num_examples: 103091
download_size: 95393598
dataset_size: 228119663
pretty_name: a
---
# Glaive's Function Calling V2 for Llama2
[Glaive's Function Calling V2 dataset](https://huggingface.co/datasets/glaiveai/glaive-function-calling-v2), formatted according to the Llama2 chat schema, with the examples that could not be converted automatically removed by hand.
Adds a special `<function>` token. As a rough sketch (assuming the `transformers` library; the checkpoint name below is a placeholder, not necessarily the base model used here), the token can be registered before fine-tuning like this:
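```python
# Sketch: register the `<function>` special token and grow the embeddings.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")  # placeholder checkpoint
model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")

tokenizer.add_special_tokens({"additional_special_tokens": ["<function>"]})
model.resize_token_embeddings(len(tokenizer))  # make room for the new token
```
Here's an example prompt: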
```
<s>[INST] <<SYS>>
<function>Available functions:
<function>{
"name": "generate_password",
"description": "Generate a random password with specified criteria",
"parameters": {
"type": "object",
"properties": {
"length": {
"type": "integer",
"description": "The length of the password"
},
"include_numbers": {
"type": "boolean",
"description": "Include numbers in the password"
},
"include_special_characters": {
"type": "boolean",
"description": "Include special characters in the password"
}
},
"required": [
"length"
]
}
}
<</SYS>>
I need a new password. Can you generate one for me? [/INST] Of course! How long would you like your password to be? And do you want it to include numbers and special characters?</s><s>[INST] I want it to be 12 characters long and yes, it should include both numbers and special characters. [/INST]<function>generate_password
{
"length": 12,
"include_numbers": true,
"include_special_characters": true
}</s><s>[INST] <function>{"password": "4#7gB6&9L1!0"} [/INST] Here is your new password: 4#7gB6&9L1!0. Please make sure to save it in a secure place.</s>
``` |
open-llm-leaderboard/details_saishf__West-Maid-7B | ---
pretty_name: Evaluation run of saishf/West-Maid-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [saishf/West-Maid-7B](https://huggingface.co/saishf/West-Maid-7B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_saishf__West-Maid-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-13T10:56:41.095810](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__West-Maid-7B/blob/main/results_2024-02-13T10-56-41.095810.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6511022834562965,\n\
\ \"acc_stderr\": 0.03201231022208247,\n \"acc_norm\": 0.6525220270281931,\n\
\ \"acc_norm_stderr\": 0.032663176772250904,\n \"mc1\": 0.3537331701346389,\n\
\ \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.5100402157242979,\n\
\ \"mc2_stderr\": 0.015147927272675395\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6305460750853242,\n \"acc_stderr\": 0.014104578366491894,\n\
\ \"acc_norm\": 0.6723549488054608,\n \"acc_norm_stderr\": 0.013715847940719339\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6742680740888269,\n\
\ \"acc_stderr\": 0.00467689886197891,\n \"acc_norm\": 0.8643696474805815,\n\
\ \"acc_norm_stderr\": 0.003416958591324802\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4312169312169312,\n \"acc_stderr\": 0.02550648169813822,\n \"\
acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.02550648169813822\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.023904914311782648,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.023904914311782648\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971125,\n\
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971125\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3888888888888889,\n \"acc_stderr\": 0.029723278961476664,\n \
\ \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.029723278961476664\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.02971914287634285,\n \
\ \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.02971914287634285\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.015848255806501534,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.015848255806501534\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n\
\ \"acc_stderr\": 0.028125972265654373,\n \"acc_norm\": 0.7990196078431373,\n\
\ \"acc_norm_stderr\": 0.028125972265654373\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n\
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368982,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368982\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258165,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258165\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37094972067039106,\n\
\ \"acc_stderr\": 0.01615591072134177,\n \"acc_norm\": 0.37094972067039106,\n\
\ \"acc_norm_stderr\": 0.01615591072134177\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042107,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042107\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4634941329856584,\n\
\ \"acc_stderr\": 0.012736153390214963,\n \"acc_norm\": 0.4634941329856584,\n\
\ \"acc_norm_stderr\": 0.012736153390214963\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.02783302387139968,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.02783302387139968\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306046,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306046\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3537331701346389,\n\
\ \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.5100402157242979,\n\
\ \"mc2_stderr\": 0.015147927272675395\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8271507498026835,\n \"acc_stderr\": 0.01062696452997185\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.623199393479909,\n \
\ \"acc_stderr\": 0.013347858757829154\n }\n}\n```"
repo_url: https://huggingface.co/saishf/West-Maid-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|arc:challenge|25_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|gsm8k|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hellaswag|10_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T10-56-41.095810.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-13T10-56-41.095810.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- '**/details_harness|winogrande|5_2024-02-13T10-56-41.095810.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-13T10-56-41.095810.parquet'
- config_name: results
data_files:
- split: 2024_02_13T10_56_41.095810
path:
- results_2024-02-13T10-56-41.095810.parquet
- split: latest
path:
- results_2024-02-13T10-56-41.095810.parquet
---
# Dataset Card for Evaluation run of saishf/West-Maid-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [saishf/West-Maid-7B](https://huggingface.co/saishf/West-Maid-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
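If you first want to see which configurations and splits a details repository exposes, the `datasets` library's inspection helpers can list them. The snippet below is a minimal sketch (network access to the Hub is assumed, and `harness_winogrande_5` is used only as an example configuration):
```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_saishf__West-Maid-7B"

# One configuration per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names(repo)
print(len(configs), "configurations, e.g.", configs[:3])

# Each configuration exposes a timestamped split plus a "latest" alias.
print(get_dataset_split_names(repo, "harness_winogrande_5"))
```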
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_saishf__West-Maid-7B",
"harness_winogrande_5",
split="train")
```
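To inspect a specific run rather than the most recent one, you can instead pass the timestamp-named split declared in the YAML header above. A minimal sketch, assuming that split name resolves as written:
```python
from datasets import load_dataset

# Split name taken verbatim from the config listing in the YAML header.
data_run = load_dataset(
    "open-llm-leaderboard/details_saishf__West-Maid-7B",
    "harness_winogrande_5",
    split="2024_02_13T10_56_41.095810",
)
```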
## Latest results
These are the [latest results from run 2024-02-13T10:56:41.095810](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__West-Maid-7B/blob/main/results_2024-02-13T10-56-41.095810.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6511022834562965,
"acc_stderr": 0.03201231022208247,
"acc_norm": 0.6525220270281931,
"acc_norm_stderr": 0.032663176772250904,
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.5100402157242979,
"mc2_stderr": 0.015147927272675395
},
"harness|arc:challenge|25": {
"acc": 0.6305460750853242,
"acc_stderr": 0.014104578366491894,
"acc_norm": 0.6723549488054608,
"acc_norm_stderr": 0.013715847940719339
},
"harness|hellaswag|10": {
"acc": 0.6742680740888269,
"acc_stderr": 0.00467689886197891,
"acc_norm": 0.8643696474805815,
"acc_norm_stderr": 0.003416958591324802
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.02550648169813822,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.02550648169813822
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782648,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782648
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971125,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971125
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.029723278961476664,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.029723278961476664
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.02971914287634285,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.02971914287634285
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.015848255806501534,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.015848255806501534
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368982,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368982
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258165,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258165
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37094972067039106,
"acc_stderr": 0.01615591072134177,
"acc_norm": 0.37094972067039106,
"acc_norm_stderr": 0.01615591072134177
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042107,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042107
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4634941329856584,
"acc_stderr": 0.012736153390214963,
"acc_norm": 0.4634941329856584,
"acc_norm_stderr": 0.012736153390214963
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.02783302387139968,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.02783302387139968
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306046,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306046
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.5100402157242979,
"mc2_stderr": 0.015147927272675395
},
"harness|winogrande|5": {
"acc": 0.8271507498026835,
"acc_stderr": 0.01062696452997185
},
"harness|gsm8k|5": {
"acc": 0.623199393479909,
"acc_stderr": 0.013347858757829154
}
}
```
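Since these aggregated numbers are also stored in the dedicated "results" configuration declared in the YAML header, you can fetch them without parsing the JSON above by hand. A minimal sketch, assuming the "latest" split behaves as described earlier:
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics for each run.
results = load_dataset(
    "open-llm-leaderboard/details_saishf__West-Maid-7B",
    "results",
    split="latest",
)
print(results[0])
```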
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_Sao10K__Skadi-Mixtral-v1 | ---
pretty_name: Evaluation run of Sao10K/Skadi-Mixtral-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/Skadi-Mixtral-v1](https://huggingface.co/Sao10K/Skadi-Mixtral-v1) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Skadi-Mixtral-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-31T19:34:40.564320](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Skadi-Mixtral-v1/blob/main/results_2024-03-31T19-34-40.564320.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7209183498326044,\n\
\ \"acc_stderr\": 0.029958365105102327,\n \"acc_norm\": 0.724144557107484,\n\
\ \"acc_norm_stderr\": 0.030537772691247418,\n \"mc1\": 0.4479804161566707,\n\
\ \"mc1_stderr\": 0.017408513063422906,\n \"mc2\": 0.6043101718764624,\n\
\ \"mc2_stderr\": 0.01510287124564243\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6774744027303754,\n \"acc_stderr\": 0.013659980894277378,\n\
\ \"acc_norm\": 0.7013651877133106,\n \"acc_norm_stderr\": 0.013374078615068738\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6914957179844653,\n\
\ \"acc_stderr\": 0.004609320024893897,\n \"acc_norm\": 0.8765186217884884,\n\
\ \"acc_norm_stderr\": 0.003283165867631369\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8026315789473685,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.8026315789473685,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7811320754716982,\n \"acc_stderr\": 0.025447863825108604,\n\
\ \"acc_norm\": 0.7811320754716982,\n \"acc_norm_stderr\": 0.025447863825108604\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8541666666666666,\n\
\ \"acc_stderr\": 0.029514245964291766,\n \"acc_norm\": 0.8541666666666666,\n\
\ \"acc_norm_stderr\": 0.029514245964291766\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7398843930635838,\n\
\ \"acc_stderr\": 0.033450369167889904,\n \"acc_norm\": 0.7398843930635838,\n\
\ \"acc_norm_stderr\": 0.033450369167889904\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.49019607843137253,\n \"acc_stderr\": 0.04974229460422817,\n\
\ \"acc_norm\": 0.49019607843137253,\n \"acc_norm_stderr\": 0.04974229460422817\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.81,\n\
\ \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6978723404255319,\n \"acc_stderr\": 0.030017554471880557,\n\
\ \"acc_norm\": 0.6978723404255319,\n \"acc_norm_stderr\": 0.030017554471880557\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6140350877192983,\n\
\ \"acc_stderr\": 0.04579639422070435,\n \"acc_norm\": 0.6140350877192983,\n\
\ \"acc_norm_stderr\": 0.04579639422070435\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6689655172413793,\n \"acc_stderr\": 0.039215453124671215,\n\
\ \"acc_norm\": 0.6689655172413793,\n \"acc_norm_stderr\": 0.039215453124671215\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.49206349206349204,\n \"acc_stderr\": 0.025748065871673286,\n \"\
acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.025748065871673286\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5634920634920635,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.5634920634920635,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8548387096774194,\n \"acc_stderr\": 0.020039563628053286,\n \"\
acc_norm\": 0.8548387096774194,\n \"acc_norm_stderr\": 0.020039563628053286\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.6354679802955665,\n \"acc_stderr\": 0.0338640574606209,\n \"acc_norm\"\
: 0.6354679802955665,\n \"acc_norm_stderr\": 0.0338640574606209\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8737373737373737,\n \"acc_stderr\": 0.023664359402880236,\n \"\
acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.023664359402880236\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.01673108529360755,\n\
\ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.01673108529360755\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7128205128205128,\n \"acc_stderr\": 0.022939925418530616,\n\
\ \"acc_norm\": 0.7128205128205128,\n \"acc_norm_stderr\": 0.022939925418530616\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.02925290592725198,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.02925290592725198\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.819327731092437,\n \"acc_stderr\": 0.02499196496660076,\n \
\ \"acc_norm\": 0.819327731092437,\n \"acc_norm_stderr\": 0.02499196496660076\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4370860927152318,\n \"acc_stderr\": 0.04050035722230636,\n \"\
acc_norm\": 0.4370860927152318,\n \"acc_norm_stderr\": 0.04050035722230636\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8935779816513761,\n \"acc_stderr\": 0.013221554674594372,\n \"\
acc_norm\": 0.8935779816513761,\n \"acc_norm_stderr\": 0.013221554674594372\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6157407407407407,\n \"acc_stderr\": 0.03317354514310742,\n \"\
acc_norm\": 0.6157407407407407,\n \"acc_norm_stderr\": 0.03317354514310742\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8627450980392157,\n \"acc_stderr\": 0.024152225962801588,\n \"\
acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.024152225962801588\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8776371308016878,\n \"acc_stderr\": 0.02133174182974679,\n \
\ \"acc_norm\": 0.8776371308016878,\n \"acc_norm_stderr\": 0.02133174182974679\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7668161434977578,\n\
\ \"acc_stderr\": 0.028380391147094702,\n \"acc_norm\": 0.7668161434977578,\n\
\ \"acc_norm_stderr\": 0.028380391147094702\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8244274809160306,\n \"acc_stderr\": 0.03336820338476073,\n\
\ \"acc_norm\": 0.8244274809160306,\n \"acc_norm_stderr\": 0.03336820338476073\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035202,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035202\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n\
\ \"acc_stderr\": 0.03520703990517963,\n \"acc_norm\": 0.8425925925925926,\n\
\ \"acc_norm_stderr\": 0.03520703990517963\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709225,\n\
\ \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709225\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6517857142857143,\n\
\ \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.6517857142857143,\n\
\ \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.03393295729761012,\n\
\ \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.03393295729761012\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9230769230769231,\n\
\ \"acc_stderr\": 0.01745698787243618,\n \"acc_norm\": 0.9230769230769231,\n\
\ \"acc_norm_stderr\": 0.01745698787243618\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8914431673052363,\n\
\ \"acc_stderr\": 0.01112428317585119,\n \"acc_norm\": 0.8914431673052363,\n\
\ \"acc_norm_stderr\": 0.01112428317585119\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7976878612716763,\n \"acc_stderr\": 0.021628077380196124,\n\
\ \"acc_norm\": 0.7976878612716763,\n \"acc_norm_stderr\": 0.021628077380196124\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4681564245810056,\n\
\ \"acc_stderr\": 0.01668855341561221,\n \"acc_norm\": 0.4681564245810056,\n\
\ \"acc_norm_stderr\": 0.01668855341561221\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8300653594771242,\n \"acc_stderr\": 0.02150538312123138,\n\
\ \"acc_norm\": 0.8300653594771242,\n \"acc_norm_stderr\": 0.02150538312123138\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.797427652733119,\n\
\ \"acc_stderr\": 0.022827317491059686,\n \"acc_norm\": 0.797427652733119,\n\
\ \"acc_norm_stderr\": 0.022827317491059686\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8425925925925926,\n \"acc_stderr\": 0.020263764996385714,\n\
\ \"acc_norm\": 0.8425925925925926,\n \"acc_norm_stderr\": 0.020263764996385714\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.549645390070922,\n \"acc_stderr\": 0.02968010556502904,\n \
\ \"acc_norm\": 0.549645390070922,\n \"acc_norm_stderr\": 0.02968010556502904\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.559322033898305,\n\
\ \"acc_stderr\": 0.012680037994097051,\n \"acc_norm\": 0.559322033898305,\n\
\ \"acc_norm_stderr\": 0.012680037994097051\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8161764705882353,\n \"acc_stderr\": 0.02352924218519311,\n\
\ \"acc_norm\": 0.8161764705882353,\n \"acc_norm_stderr\": 0.02352924218519311\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7908496732026143,\n \"acc_stderr\": 0.016453399332279323,\n \
\ \"acc_norm\": 0.7908496732026143,\n \"acc_norm_stderr\": 0.016453399332279323\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7918367346938775,\n \"acc_stderr\": 0.025991117672813296,\n\
\ \"acc_norm\": 0.7918367346938775,\n \"acc_norm_stderr\": 0.025991117672813296\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n\
\ \"acc_stderr\": 0.022509345325101706,\n \"acc_norm\": 0.8855721393034826,\n\
\ \"acc_norm_stderr\": 0.022509345325101706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8947368421052632,\n \"acc_stderr\": 0.02353755765789256,\n\
\ \"acc_norm\": 0.8947368421052632,\n \"acc_norm_stderr\": 0.02353755765789256\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4479804161566707,\n\
\ \"mc1_stderr\": 0.017408513063422906,\n \"mc2\": 0.6043101718764624,\n\
\ \"mc2_stderr\": 0.01510287124564243\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8129439621152328,\n \"acc_stderr\": 0.010959716435242912\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6474601971190296,\n \
\ \"acc_stderr\": 0.013159909755930328\n }\n}\n```"
repo_url: https://huggingface.co/Sao10K/Skadi-Mixtral-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|arc:challenge|25_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|gsm8k|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hellaswag|10_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T19-34-40.564320.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-31T19-34-40.564320.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- '**/details_harness|winogrande|5_2024-03-31T19-34-40.564320.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-31T19-34-40.564320.parquet'
- config_name: results
data_files:
- split: 2024_03_31T19_34_40.564320
path:
- results_2024-03-31T19-34-40.564320.parquet
- split: latest
path:
- results_2024-03-31T19-34-40.564320.parquet
---
# Dataset Card for Evaluation run of Sao10K/Skadi-Mixtral-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Sao10K/Skadi-Mixtral-v1](https://huggingface.co/Sao10K/Skadi-Mixtral-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Skadi-Mixtral-v1",
"harness_winogrande_5",
	split="latest")
```
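Similarly, a minimal sketch for loading the aggregated results configuration mentioned above (assuming the `results` config exposes the same `latest` split as the task configs):
```python
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_Sao10K__Skadi-Mixtral-v1",
    "results",
    split="latest",
)
```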
## Latest results
These are the [latest results from run 2024-03-31T19:34:40.564320](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Skadi-Mixtral-v1/blob/main/results_2024-03-31T19-34-40.564320.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7209183498326044,
"acc_stderr": 0.029958365105102327,
"acc_norm": 0.724144557107484,
"acc_norm_stderr": 0.030537772691247418,
"mc1": 0.4479804161566707,
"mc1_stderr": 0.017408513063422906,
"mc2": 0.6043101718764624,
"mc2_stderr": 0.01510287124564243
},
"harness|arc:challenge|25": {
"acc": 0.6774744027303754,
"acc_stderr": 0.013659980894277378,
"acc_norm": 0.7013651877133106,
"acc_norm_stderr": 0.013374078615068738
},
"harness|hellaswag|10": {
"acc": 0.6914957179844653,
"acc_stderr": 0.004609320024893897,
"acc_norm": 0.8765186217884884,
"acc_norm_stderr": 0.003283165867631369
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.674074074074074,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.674074074074074,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8026315789473685,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.8026315789473685,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7811320754716982,
"acc_stderr": 0.025447863825108604,
"acc_norm": 0.7811320754716982,
"acc_norm_stderr": 0.025447863825108604
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8541666666666666,
"acc_stderr": 0.029514245964291766,
"acc_norm": 0.8541666666666666,
"acc_norm_stderr": 0.029514245964291766
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.033450369167889904,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.033450369167889904
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.49019607843137253,
"acc_stderr": 0.04974229460422817,
"acc_norm": 0.49019607843137253,
"acc_norm_stderr": 0.04974229460422817
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6978723404255319,
"acc_stderr": 0.030017554471880557,
"acc_norm": 0.6978723404255319,
"acc_norm_stderr": 0.030017554471880557
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6140350877192983,
"acc_stderr": 0.04579639422070435,
"acc_norm": 0.6140350877192983,
"acc_norm_stderr": 0.04579639422070435
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6689655172413793,
"acc_stderr": 0.039215453124671215,
"acc_norm": 0.6689655172413793,
"acc_norm_stderr": 0.039215453124671215
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.025748065871673286,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.025748065871673286
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5634920634920635,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.5634920634920635,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8548387096774194,
"acc_stderr": 0.020039563628053286,
"acc_norm": 0.8548387096774194,
"acc_norm_stderr": 0.020039563628053286
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6354679802955665,
"acc_stderr": 0.0338640574606209,
"acc_norm": 0.6354679802955665,
"acc_norm_stderr": 0.0338640574606209
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8737373737373737,
"acc_stderr": 0.023664359402880236,
"acc_norm": 0.8737373737373737,
"acc_norm_stderr": 0.023664359402880236
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.01673108529360755,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.01673108529360755
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7128205128205128,
"acc_stderr": 0.022939925418530616,
"acc_norm": 0.7128205128205128,
"acc_norm_stderr": 0.022939925418530616
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.02925290592725198,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.02925290592725198
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.819327731092437,
"acc_stderr": 0.02499196496660076,
"acc_norm": 0.819327731092437,
"acc_norm_stderr": 0.02499196496660076
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4370860927152318,
"acc_stderr": 0.04050035722230636,
"acc_norm": 0.4370860927152318,
"acc_norm_stderr": 0.04050035722230636
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8935779816513761,
"acc_stderr": 0.013221554674594372,
"acc_norm": 0.8935779816513761,
"acc_norm_stderr": 0.013221554674594372
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6157407407407407,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.6157407407407407,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.024152225962801588,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.024152225962801588
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8776371308016878,
"acc_stderr": 0.02133174182974679,
"acc_norm": 0.8776371308016878,
"acc_norm_stderr": 0.02133174182974679
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7668161434977578,
"acc_stderr": 0.028380391147094702,
"acc_norm": 0.7668161434977578,
"acc_norm_stderr": 0.028380391147094702
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8244274809160306,
"acc_stderr": 0.03336820338476073,
"acc_norm": 0.8244274809160306,
"acc_norm_stderr": 0.03336820338476073
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035202,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035202
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.03520703990517963,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.03520703990517963
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709225,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709225
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6517857142857143,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.6517857142857143,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.8640776699029126,
"acc_stderr": 0.03393295729761012,
"acc_norm": 0.8640776699029126,
"acc_norm_stderr": 0.03393295729761012
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9230769230769231,
"acc_stderr": 0.01745698787243618,
"acc_norm": 0.9230769230769231,
"acc_norm_stderr": 0.01745698787243618
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8914431673052363,
"acc_stderr": 0.01112428317585119,
"acc_norm": 0.8914431673052363,
"acc_norm_stderr": 0.01112428317585119
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7976878612716763,
"acc_stderr": 0.021628077380196124,
"acc_norm": 0.7976878612716763,
"acc_norm_stderr": 0.021628077380196124
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4681564245810056,
"acc_stderr": 0.01668855341561221,
"acc_norm": 0.4681564245810056,
"acc_norm_stderr": 0.01668855341561221
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8300653594771242,
"acc_stderr": 0.02150538312123138,
"acc_norm": 0.8300653594771242,
"acc_norm_stderr": 0.02150538312123138
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.797427652733119,
"acc_stderr": 0.022827317491059686,
"acc_norm": 0.797427652733119,
"acc_norm_stderr": 0.022827317491059686
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.020263764996385714,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.020263764996385714
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.549645390070922,
"acc_stderr": 0.02968010556502904,
"acc_norm": 0.549645390070922,
"acc_norm_stderr": 0.02968010556502904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.559322033898305,
"acc_stderr": 0.012680037994097051,
"acc_norm": 0.559322033898305,
"acc_norm_stderr": 0.012680037994097051
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8161764705882353,
"acc_stderr": 0.02352924218519311,
"acc_norm": 0.8161764705882353,
"acc_norm_stderr": 0.02352924218519311
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7908496732026143,
"acc_stderr": 0.016453399332279323,
"acc_norm": 0.7908496732026143,
"acc_norm_stderr": 0.016453399332279323
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7918367346938775,
"acc_stderr": 0.025991117672813296,
"acc_norm": 0.7918367346938775,
"acc_norm_stderr": 0.025991117672813296
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101706,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8947368421052632,
"acc_stderr": 0.02353755765789256,
"acc_norm": 0.8947368421052632,
"acc_norm_stderr": 0.02353755765789256
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4479804161566707,
"mc1_stderr": 0.017408513063422906,
"mc2": 0.6043101718764624,
"mc2_stderr": 0.01510287124564243
},
"harness|winogrande|5": {
"acc": 0.8129439621152328,
"acc_stderr": 0.010959716435242912
},
"harness|gsm8k|5": {
"acc": 0.6474601971190296,
"acc_stderr": 0.013159909755930328
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
threite/Bundestag-v2 | ---
annotations_creators: []
language:
- de
language_creators:
- expert-generated
license:
- cc0-1.0
multilinguality:
- monolingual
pretty_name: Bundestag-v2
size_categories:
- 100K<n<1M
source_datasets: []
tags: ['Bundestag', 'ParlSpeech']
task_categories:
- text-classification
task_ids:
- entity-linking-classification
---
# Dataset Card for Bundestag-v2
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- Homepage: https://doi.org/10.7910/DVN/L4OAKN
### Dataset Summary
This dataset was generated from the [ParlSpeech V2](https://doi.org/10.7910/DVN/L4OAKN) dataset. It contains speeches from the German parliament (Bundestag) from 1990 until 2020, labelled with the party of the speaker.
### Supported Tasks
Text Classification
### Languages
German
## Dataset Structure
### Data Fields
- text: Transcript of the speech in German
- party: Party of the speaker
### Data Splits
- train
- validation
- test
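For illustration, a minimal loading sketch using the fields and splits listed above (the record-access pattern is an assumption about typical usage, not part of the dataset card):
```python
from datasets import load_dataset

# "validation" and "test" splits are available as well.
dataset = load_dataset("threite/Bundestag-v2", split="train")

example = dataset[0]
example["text"]   # transcript of the speech in German
example["party"]  # party of the speaker
```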
## Dataset Creation
### Curation Rationale
Created to train a language model, which is able to classify speeches by party.
### Source Data
#### Initial Data Collection and Normalization
- [ParlSpeech V2](https://doi.org/10.7910/DVN/L4OAKN)
## Considerations for Using the Data
### Social Impact of Dataset
These are political speeches; the content can therefore be controversial and potentially harmful.
## Additional Information
### Licensing Information
[CC0 1.0](http://creativecommons.org/publicdomain/zero/1.0)
### Citation Information
Bibtex entry:
```
@data{DVN/L4OAKN_2020,
author = {Rauh, Christian and Schwalbach, Jan},
publisher = {Harvard Dataverse},
title = {{The ParlSpeech V2 data set: Full-text corpora of 6.3 million parliamentary speeches in the key legislative chambers of nine representative democracies}},
year = {2020},
version = {V1},
doi = {10.7910/DVN/L4OAKN},
url = {https://doi.org/10.7910/DVN/L4OAKN}
}
``` |
sngsfydy/DR_Grading | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
'2': '2'
'3': '3'
'4': '4'
splits:
- name: train
num_bytes: 261501746.0
num_examples: 413
- name: test
num_bytes: 64805638.0
num_examples: 103
download_size: 0
dataset_size: 326307384.0
---
# Dataset Card for "DR_Grading"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FaalSa/data12 | ---
dataset_info:
features:
- name: start
dtype: timestamp[s]
- name: target
sequence: float32
- name: item_id
dtype: string
- name: feat_static_cat
sequence: uint64
splits:
- name: train
num_bytes: 17310
num_examples: 1
- name: validation
num_bytes: 17790
num_examples: 1
- name: test
num_bytes: 18270
num_examples: 1
download_size: 12886
dataset_size: 53370
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
anan-2024/twitter_dataset_1713124479 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 225340
num_examples: 606
download_size: 129226
dataset_size: 225340
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
INSAIT-Institute/arc-challenge-bgeval | ---
license: cc-by-sa-4.0
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
sequence:
- name: text
dtype: string
- name: label
dtype: string
- name: answerKey
dtype: string
splits:
- name: train
num_bytes: 593735
num_examples: 1119
- name: test
num_bytes: 639866
num_examples: 1172
- name: validation
num_bytes: 166067
num_examples: 299
download_size: 647884
dataset_size: 1399668
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
tner/fin | ---
language:
- en
license:
- mit
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
task_categories:
- token-classification
task_ids:
- named-entity-recognition
pretty_name: FIN
---
# Dataset Card for "tner/fin"
## Dataset Description
- **Repository:** [T-NER](https://github.com/asahi417/tner)
- **Paper:** [https://aclanthology.org/U15-1010.pdf](https://aclanthology.org/U15-1010.pdf)
- **Dataset:** FIN
- **Domain:** Financial News
- **Number of Entity:** 4
### Dataset Summary
The FIN NER dataset formatted as part of the [TNER](https://github.com/asahi417/tner) project.
The FIN dataset contains training (FIN5) and test (FIN3) sets only, so we randomly sample half the number of test instances from the training set to create a validation set.
- Entity Types: `ORG`, `LOC`, `PER`, `MISC`
## Dataset Structure
### Data Instances
An example of `train` looks as follows.
```
{
"tags": [0, 0, 0, 0, 0, 3, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
"tokens": ["1", ".", "1", ".", "4", "Borrower", "engages", "in", "criminal", "conduct", "or", "is", "involved", "in", "criminal", "activities", ";"]
}
```
### Label ID
The label2id dictionary can be found at [here](https://huggingface.co/datasets/tner/fin/raw/main/dataset/label.json).
```python
{
"O": 0,
"B-PER": 1,
"B-LOC": 2,
"B-ORG": 3,
"B-MISC": 4,
"I-PER": 5,
"I-LOC": 6,
"I-ORG": 7,
"I-MISC": 8
}
```
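As an illustration, a small sketch that inverts this dictionary to map the integer `tags` of the `train` example above back to string labels (the inversion is an assumed usage pattern, not part of the dataset itself):
```python
label2id = {
    "O": 0, "B-PER": 1, "B-LOC": 2, "B-ORG": 3, "B-MISC": 4,
    "I-PER": 5, "I-LOC": 6, "I-ORG": 7, "I-MISC": 8,
}
id2label = {v: k for k, v in label2id.items()}

tags = [0, 0, 0, 0, 0, 3, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
print([id2label[t] for t in tags])  # "Borrower" (position 5) carries B-ORG
```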
### Data Splits
| name |train|validation|test|
|---------|----:|---------:|---:|
|fin |1014 | 303| 150|
### Citation Information
```
@inproceedings{salinas-alvarado-etal-2015-domain,
title = "Domain Adaption of Named Entity Recognition to Support Credit Risk Assessment",
author = "Salinas Alvarado, Julio Cesar and
Verspoor, Karin and
Baldwin, Timothy",
booktitle = "Proceedings of the Australasian Language Technology Association Workshop 2015",
month = dec,
year = "2015",
address = "Parramatta, Australia",
url = "https://aclanthology.org/U15-1010",
pages = "84--90",
}
``` |
asaxena1990/dummyset2 | ---
license: cc-by-nc-sa-4.0
---
|
JestemKamil/NexiaBot-Dataset | ---
dataset_info:
features:
- name: conversation
struct:
- name: conversationId
dtype: int64
- name: conversationName
dtype: string
- name: messages
list:
- name: role
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 43467
num_examples: 144
download_size: 22650
dataset_size: 43467
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
OllieStanley/humaneval-mbpp-testgen-qa | ---
dataset_info:
features:
- name: INSTRUCTION
dtype: string
- name: RESPONSE
dtype: string
- name: SOURCE
dtype: string
splits:
- name: train
num_bytes: 304315
num_examples: 591
download_size: 0
dataset_size: 304315
---
# Dataset Card for "humaneval-mbpp-testgen-qa"
This dataset contains prompt-reply (question-answer) pairs where the prompt asks for a Python unit test that checks the functionality described in a specific docstring. The responses are the generated unit tests.
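A minimal loading sketch, assuming the repository id `OllieStanley/humaneval-mbpp-testgen-qa` and the fields from the schema above:
```python
from datasets import load_dataset

dataset = load_dataset("OllieStanley/humaneval-mbpp-testgen-qa", split="train")

example = dataset[0]
example["INSTRUCTION"]  # prompt asking for a unit test for a given docstring
example["RESPONSE"]     # the generated unit test
example["SOURCE"]       # provenance of the underlying problem
``` |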
jenyag/repo-code-completion | ---
license: apache-2.0
dataset_info:
- config_name: alphabetical_composer_all_context
features:
- name: repo_id
dtype: int64
- name: repo_name
dtype: string
- name: project_context
dtype: string
- name: file_context
list:
- name: content
dtype: string
- name: type
dtype: string
- name: gt
sequence: string
- name: metainfo_separator
dtype: string
splits:
- name: test
num_bytes: 590554966
num_examples: 224
download_size: 236538429
dataset_size: 590554966
- config_name: alphabetical_composer_non_py_context
features:
- name: repo_id
dtype: int64
- name: repo_name
dtype: string
- name: project_context
dtype: string
- name: file_context
list:
- name: content
dtype: string
- name: type
dtype: string
- name: gt
sequence: string
- name: metainfo_separator
dtype: string
splits:
- name: test
num_bytes: 560157388
num_examples: 224
download_size: 226511858
dataset_size: 560157388
- config_name: alphabetical_composer_py_context
features:
- name: repo_id
dtype: int64
- name: repo_name
dtype: string
- name: project_context
dtype: string
- name: file_context
list:
- name: content
dtype: string
- name: type
dtype: string
- name: gt
sequence: string
- name: metainfo_separator
dtype: string
splits:
- name: test
num_bytes: 114370147
num_examples: 224
download_size: 22096586
dataset_size: 114370147
- config_name: file_length_composer_all_context
features:
- name: repo_id
dtype: int64
- name: repo_name
dtype: string
- name: project_context
dtype: string
- name: file_context
list:
- name: content
dtype: string
- name: type
dtype: string
- name: gt
sequence: string
- name: metainfo_separator
dtype: string
splits:
- name: test
num_bytes: 590554966
num_examples: 224
download_size: 239093262
dataset_size: 590554966
- config_name: file_length_composer_non_py_context
features:
- name: repo_id
dtype: int64
- name: repo_name
dtype: string
- name: project_context
dtype: string
- name: file_context
list:
- name: content
dtype: string
- name: type
dtype: string
- name: gt
sequence: string
- name: metainfo_separator
dtype: string
splits:
- name: test
num_bytes: 560157388
num_examples: 224
download_size: 228632512
dataset_size: 560157388
- config_name: file_length_composer_py_context
features:
- name: repo_id
dtype: int64
- name: repo_name
dtype: string
- name: project_context
dtype: string
- name: file_context
list:
- name: content
dtype: string
- name: type
dtype: string
- name: gt
sequence: string
- name: metainfo_separator
dtype: string
splits:
- name: test
num_bytes: 114370147
num_examples: 224
download_size: 22181715
dataset_size: 114370147
- config_name: function_class_mask_half_composer_all_context
features:
- name: repo_id
dtype: int64
- name: repo_name
dtype: string
- name: project_context
dtype: string
- name: file_context
list:
- name: content
dtype: string
- name: type
dtype: string
- name: gt
sequence: string
- name: metainfo_separator
dtype: string
splits:
- name: test
num_bytes: 316335006
num_examples: 224
download_size: 0
dataset_size: 316335006
- config_name: function_class_mask_half_composer_non_py_context
features:
- name: repo_id
dtype: int64
- name: repo_name
dtype: string
- name: project_context
dtype: string
- name: file_context
list:
- name: content
dtype: string
- name: type
dtype: string
- name: gt
sequence: string
- name: metainfo_separator
dtype: string
splits:
- name: test
num_bytes: 315664977
num_examples: 224
download_size: 127938122
dataset_size: 315664977
- config_name: function_class_mask_half_composer_py_context
features:
- name: repo_id
dtype: int64
- name: repo_name
dtype: string
- name: project_context
dtype: string
- name: file_context
list:
- name: content
dtype: string
- name: type
dtype: string
- name: gt
sequence: string
- name: metainfo_separator
dtype: string
splits:
- name: test
num_bytes: 101260211
num_examples: 224
download_size: 17862587
dataset_size: 101260211
- config_name: function_class_mask_one_composer_all_context
features:
- name: repo_id
dtype: int64
- name: repo_name
dtype: string
- name: project_context
dtype: string
- name: file_context
list:
- name: content
dtype: string
- name: type
dtype: string
- name: gt
sequence: string
- name: metainfo_separator
dtype: string
splits:
- name: test
num_bytes: 90116249
num_examples: 224
download_size: 13554986
dataset_size: 90116249
- config_name: function_class_mask_one_composer_non_py_context
features:
- name: repo_id
dtype: int64
- name: repo_name
dtype: string
- name: project_context
dtype: string
- name: file_context
list:
- name: content
dtype: string
- name: type
dtype: string
- name: gt
sequence: string
- name: metainfo_separator
dtype: string
splits:
- name: test
num_bytes: 105054619
num_examples: 224
download_size: 15624970
dataset_size: 105054619
- config_name: function_class_mask_one_composer_py_context
features:
- name: repo_id
dtype: int64
- name: repo_name
dtype: string
- name: project_context
dtype: string
- name: file_context
list:
- name: content
dtype: string
- name: type
dtype: string
- name: gt
sequence: string
- name: metainfo_separator
dtype: string
splits:
- name: test
num_bytes: 87046937
num_examples: 224
download_size: 12999652
dataset_size: 87046937
- config_name: half_memory_composer_all_context
features:
- name: repo_id
dtype: int64
- name: repo_name
dtype: string
- name: project_context
dtype: string
- name: file_context
list:
- name: content
dtype: string
- name: type
dtype: string
- name: gt
sequence: string
- name: metainfo_separator
dtype: string
splits:
- name: test
num_bytes: 334960024
num_examples: 224
download_size: 123799195
dataset_size: 334960024
- config_name: half_memory_composer_non_py_context
features:
- name: repo_id
dtype: int64
- name: repo_name
dtype: string
- name: project_context
dtype: string
- name: file_context
list:
- name: content
dtype: string
- name: type
dtype: string
- name: gt
sequence: string
- name: metainfo_separator
dtype: string
splits:
- name: test
num_bytes: 311325289
num_examples: 224
download_size: 115444406
dataset_size: 311325289
- config_name: half_memory_composer_py_context
features:
- name: repo_id
dtype: int64
- name: repo_name
dtype: string
- name: project_context
dtype: string
- name: file_context
list:
- name: content
dtype: string
- name: type
dtype: string
- name: gt
sequence: string
- name: metainfo_separator
dtype: string
splits:
- name: test
num_bytes: 99351776
num_examples: 224
download_size: 18008844
dataset_size: 99351776
- config_name: imports_first_composer_all_context
features:
- name: repo_id
dtype: int64
- name: repo_name
dtype: string
- name: project_context
dtype: string
- name: file_context
list:
- name: content
dtype: string
- name: type
dtype: string
- name: gt
sequence: string
- name: metainfo_separator
dtype: string
splits:
- name: test
num_bytes: 590554966
num_examples: 224
download_size: 236389259
dataset_size: 590554966
- config_name: imports_first_composer_non_py_context
features:
- name: repo_id
dtype: int64
- name: repo_name
dtype: string
- name: project_context
dtype: string
- name: file_context
list:
- name: content
dtype: string
- name: type
dtype: string
- name: gt
sequence: string
- name: metainfo_separator
dtype: string
splits:
- name: test
num_bytes: 560157388
num_examples: 224
download_size: 226465503
dataset_size: 560157388
- config_name: imports_first_composer_py_context
features:
- name: repo_id
dtype: int64
- name: repo_name
dtype: string
- name: project_context
dtype: string
- name: file_context
list:
- name: content
dtype: string
- name: type
dtype: string
- name: gt
sequence: string
- name: metainfo_separator
dtype: string
splits:
- name: test
num_bytes: 114370147
num_examples: 224
download_size: 22077336
dataset_size: 114370147
- config_name: naive_composer_all_context
features:
- name: repo_id
dtype: int64
- name: repo_name
dtype: string
- name: project_context
dtype: string
- name: file_context
list:
- name: content
dtype: string
- name: type
dtype: string
- name: gt
sequence: string
- name: metainfo_separator
dtype: string
splits:
- name: test
num_bytes: 590554966
num_examples: 224
download_size: 236382094
dataset_size: 590554966
- config_name: naive_composer_non_py_context
features:
- name: repo_id
dtype: int64
- name: repo_name
dtype: string
- name: project_context
dtype: string
- name: file_context
list:
- name: content
dtype: string
- name: type
dtype: string
- name: gt
sequence: string
- name: metainfo_separator
dtype: string
splits:
- name: test
num_bytes: 560157388
num_examples: 224
download_size: 226480268
dataset_size: 560157388
- config_name: naive_composer_py_context
features:
- name: repo_id
dtype: int64
- name: repo_name
dtype: string
- name: project_context
dtype: string
- name: file_context
list:
- name: content
dtype: string
- name: type
dtype: string
- name: gt
sequence: string
- name: metainfo_separator
dtype: string
splits:
- name: test
num_bytes: 114370147
num_examples: 224
download_size: 22084803
dataset_size: 114370147
- config_name: path_distance_composer_all_context
features:
- name: repo_id
dtype: int64
- name: repo_name
dtype: string
- name: project_context
dtype: string
- name: file_context
list:
- name: content
dtype: string
- name: type
dtype: string
- name: gt
sequence: string
- name: metainfo_separator
dtype: string
splits:
- name: test
num_bytes: 590554966
num_examples: 224
download_size: 236585246
dataset_size: 590554966
- config_name: path_distance_composer_non_py_context
features:
- name: repo_id
dtype: int64
- name: repo_name
dtype: string
- name: project_context
dtype: string
- name: file_context
list:
- name: content
dtype: string
- name: type
dtype: string
- name: gt
sequence: string
- name: metainfo_separator
dtype: string
splits:
- name: test
num_bytes: 560157388
num_examples: 224
download_size: 226460548
dataset_size: 560157388
- config_name: path_distance_composer_py_context
features:
- name: repo_id
dtype: int64
- name: repo_name
dtype: string
- name: project_context
dtype: string
- name: file_context
list:
- name: content
dtype: string
- name: type
dtype: string
- name: gt
sequence: string
- name: metainfo_separator
dtype: string
splits:
- name: test
num_bytes: 114370147
num_examples: 224
download_size: 22014753
dataset_size: 114370147
configs:
- config_name: function_class_mask_half_composer_all_context
data_files:
- split: test
path: data/function_class_mask_half_composer/all_context/test-*
- config_name: function_class_mask_half_composer_non_py_context
data_files:
- split: test
path: data/function_class_mask_half_composer/non_py_context/test-*
- config_name: function_class_mask_half_composer_py_context
data_files:
- split: test
path: data/function_class_mask_half_composer/py_context/test-*
- config_name: imports_first_composer_all_context
data_files:
- split: test
path: data/imports_first_composer/all_context/test-*
- config_name: imports_first_composer_non_py_context
data_files:
- split: test
path: data/imports_first_composer/non_py_context/test-*
- config_name: imports_first_composer_py_context
data_files:
- split: test
path: data/imports_first_composer/py_context/test-*
- config_name: alphabetical_composer_all_context
data_files:
- split: test
path: data/alphabetical_composer/all_context/test-*
- config_name: alphabetical_composer_non_py_context
data_files:
- split: test
path: data/alphabetical_composer/non_py_context/test-*
- config_name: alphabetical_composer_py_context
data_files:
- split: test
path: data/alphabetical_composer/py_context/test-*
- config_name: naive_composer_all_context
data_files:
- split: test
path: data/naive_composer/all_context/test-*
- config_name: naive_composer_non_py_context
data_files:
- split: test
path: data/naive_composer/non_py_context/test-*
- config_name: naive_composer_py_context
data_files:
- split: test
path: data/naive_composer/py_context/test-*
- config_name: path_distance_composer_all_context
data_files:
- split: test
path: data/path_distance_composer/all_context/test-*
- config_name: path_distance_composer_non_py_context
data_files:
- split: test
path: data/path_distance_composer/non_py_context/test-*
- config_name: path_distance_composer_py_context
data_files:
- split: test
path: data/path_distance_composer/py_context/test-*
default: True
- config_name: file_length_composer_all_context
data_files:
- split: test
path: data/file_length_composer/all_context/test-*
- config_name: file_length_composer_non_py_context
data_files:
- split: test
path: data/file_length_composer/non_py_context/test-*
- config_name: file_length_composer_py_context
data_files:
- split: test
path: data/file_length_composer/py_context/test-*
- config_name: half_memory_composer_all_context
data_files:
- split: test
path: data/half_memory_composer/all_context/test-*
- config_name: half_memory_composer_non_py_context
data_files:
- split: test
path: data/half_memory_composer/non_py_context/test-*
- config_name: half_memory_composer_py_context
data_files:
- split: test
path: data/half_memory_composer/py_context/test-*
- config_name: function_class_mask_one_composer_all_context
data_files:
- split: test
path: data/function_class_mask_one_composer/all_context/test-*
- config_name: function_class_mask_one_composer_non_py_context
data_files:
- split: test
path: data/function_class_mask_one_composer/non_py_context/test-*
- config_name: function_class_mask_one_composer_py_context
data_files:
- split: test
path: data/function_class_mask_one_composer/py_context/test-*
---
# Repository Level Code Completion Dataset for Evaluation
This is a dataset of repository snapshots taken before a commit in which a Python file was added. The task is to complete the added file, given the content of the repository composed in different ways.
## How to load the data
1. via [`load_dataset`](https://huggingface.co/docs/datasets/v2.14.3/en/package_reference/loading_methods#datasets.load_dataset):
```python
from datasets import load_dataset

data_files = "data/path_distance_composer/py_context/test-*"  # choose any pattern from the table below
dataset = load_dataset("jenyag/repo-code-completion", data_files=data_files, split="train")
```
#### Options for `data_files`:
| | **all_context** | **non_py_context** | **py_context** |
|----|----|----|----|
| **function class mask half composer** | data/function_class_mask_half_composer/all_context/test-* | data/function_class_mask_half_composer/non_py_context/test-* | data/function_class_mask_half_composer/py_context/test-* |
| **imports first composer** | data/imports_first_composer/all_context/test-* | data/imports_first_composer/non_py_context/test-* | data/imports_first_composer/py_context/test-* |
| **alphabetical composer** | data/alphabetical_composer/all_context/test-* | data/alphabetical_composer/non_py_context/test-* | data/alphabetical_composer/py_context/test-* |
| **naive composer** | data/naive_composer/all_context/test-* | data/naive_composer/non_py_context/test-* | data/naive_composer/py_context/test-* |
| **path distance composer** | data/path_distance_composer/all_context/test-* | data/path_distance_composer/non_py_context/test-* | data/path_distance_composer/py_context/test-* |
| **file length composer** | data/file_length_composer/all_context/test-* | data/file_length_composer/non_py_context/test-* | data/file_length_composer/py_context/test-* |
| **half memory composer** | data/half_memory_composer/all_context/test-* | data/half_memory_composer/non_py_context/test-* | data/half_memory_composer/py_context/test-* |
| **function class mask one composer** | data/function_class_mask_one_composer/all_context/test-* | data/function_class_mask_one_composer/non_py_context/test-* | data/function_class_mask_one_composer/py_context/test-* |
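2. via the named configurations declared in the dataset metadata (a sketch assuming each `<composer>_<context>` pair is exposed as a config with a `test` split, as the YAML header above suggests):
```python
from datasets import load_dataset

dataset = load_dataset(
    "jenyag/repo-code-completion",
    "path_distance_composer_py_context",  # any config name from the header
    split="test",
)
```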
## How to get the full context for a specific line
```python
for datapoint in dataset:
    project_context = datapoint['project_context']  # the project context may be quite long
    # Each file context is paired with the ground-truth lines to be completed.
    for file_context_dict, ground_truth in zip(datapoint['file_context'], datapoint['gt']):
        file_context = file_context_dict['content']
        # Feed full_context to the model and compare its output with ground_truth.
        full_context = project_context + file_context
```
|
LooksJuicy/ruozhiba | ---
license: apache-2.0
task_categories:
- text-generation
language:
- zh
---
Inspired by [COIG-CQIA](https://huggingface.co/datasets/m-a-p/COIG-CQIA/blob/main/ruozhiba/ruozhiba_ruozhiba.jsonl), this dataset was constructed along the same lines, but with answers in a comparatively more concise style.
The curated Ruozhiba questions come from the [interrogative sentences](https://docs.qq.com/sheet/DUlZ6aURhamdwb1RO?tab=BB08J2) provided on [GitHub](https://github.com/Leymore/ruozhiba/tree/main?tab=readme-ov-file); answers were obtained by calling GPT-4, and replies that were clearly refusals were filtered out. |