| datasetId | card |
|---|---|
Mutonix/RefGPT-Code-bg | ---
license: apache-2.0
dataset_info:
features:
- name: dialogue
dtype: string
- name: reference
dtype: string
- name: language
dtype: string
- name: type
dtype: string
splits:
- name: en
num_bytes: 106344832.26735915
num_examples: 8848
- name: zh
num_bytes: 101753322.73345818
num_examples: 9597
download_size: 86625605
dataset_size: 208098155.00081733
task_categories:
- conversational
language:
- zh
- en
arxiv: https://arxiv.org/abs/2305.14994
size_categories:
- 10K<n<100K
---
# Dataset Card for RefGPT-Code-bg
## Dataset Description
- **Homepage:**
- **Repository:** [https://github.com/ziliwangnlp/RefGPT](https://github.com/ziliwangnlp/RefGPT)
- **Paper:** [https://arxiv.org/abs/2305.14994](https://arxiv.org/abs/2305.14994)
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
<p align="center">
<a href="https://arxiv.org/abs/2305.14994"><b>[Paper] RefGPT</b></a> |
<a href="https://github.com/ziliwangnlp/RefGPT"><b>[Github] RefGPT</b></a>
</p>
RefGPT-Code is a dataset containing 76k multi-turn dialogues about programming, with 37k in English and 39k in Chinese, covering most aspects of code usage scenarios and multiple programming languages. Both the English and Chinese versions use the public GitHub dataset on Google BigQuery, with no overlap between the two languages. RefGPT-Code derives various ways of leveraging program code as the reference to enable different scenarios. We consider three perspectives in RefGPT-Code: code discussion, code creation, and bug fixing.
**RefGPT-Code-bg** is the "bug fixing" subset.
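As a quick consistency check on the split statistics in the card metadata above (pure arithmetic, no download needed), the per-split byte and example counts imply a total within the declared `10K<n<100K` size category and roughly 10–12 KB per dialogue:

```python
# Split statistics copied from the card's dataset_info metadata above.
splits = {
    "en": {"num_bytes": 106344832.26735915, "num_examples": 8848},
    "zh": {"num_bytes": 101753322.73345818, "num_examples": 9597},
}

# Total examples across both language splits.
total_examples = sum(s["num_examples"] for s in splits.values())

# Average dialogue size per split, in kibibytes.
avg_kb = {
    name: s["num_bytes"] / s["num_examples"] / 1024
    for name, s in splits.items()
}

# Consistent with the declared size category.
assert 10_000 < total_examples < 100_000
```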
### Supported Tasks and Leaderboards
Chatbot instruction finetuning
### Languages
Chinese, English
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
Please note that the RefGPT datasets, including RefGPT-Fact and RefGPT-Code, have not undergone manual verification, so their security cannot be strictly guaranteed. Users should be aware that they are responsible for any results generated using this data.
### Discussion of Biases
As the RefGPT-Fact and RefGPT-Code datasets are collected from references such as Wikipedia and GitHub repositories, it cannot be avoided that a reference itself may contain factual errors, typos, or, in the case of GitHub repositories, bugs and malicious code. The datasets may also reflect the biases of the selected references and of the GPT-3.5/GPT-4 models.
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```bibtex
@misc{yang2023refgpt,
title={RefGPT: Reference -> Truthful & Customized Dialogues Generation by GPTs and for GPTs},
author={Dongjie Yang and Ruifeng Yuan and YuanTao Fan and YiFei Yang and Zili Wang and Shusen Wang and Hai Zhao},
year={2023},
eprint={2305.14994},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
### Contributions
[More Information Needed] |
Mathews/DatasetTuning | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3142
num_examples: 16
download_size: 3295
dataset_size: 3142
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kasvii/face-partuv2beautifulluv-controluv-ffhq10-samples | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: edit_prompt
dtype: string
- name: edited_image
dtype: image
- name: control_image
dtype: image
splits:
- name: train
num_bytes: 7315417.0
num_examples: 10
download_size: 4408703
dataset_size: 7315417.0
---
# Dataset Card for "face-partuv2beautifulluv-controluv-ffhq10-samples"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_KoboldAI__OPT-350M-Nerys-v2 | ---
pretty_name: Evaluation run of KoboldAI/OPT-350M-Nerys-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KoboldAI/OPT-350M-Nerys-v2](https://huggingface.co/KoboldAI/OPT-350M-Nerys-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KoboldAI__OPT-350M-Nerys-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-21T16:22:23.406290](https://huggingface.co/datasets/open-llm-leaderboard/details_KoboldAI__OPT-350M-Nerys-v2/blob/main/results_2023-10-21T16-22-23.406290.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0014681208053691276,\n\
\ \"em_stderr\": 0.0003921042190298392,\n \"f1\": 0.041601300335570565,\n\
\ \"f1_stderr\": 0.001164099674986064,\n \"acc\": 0.26150165183377183,\n\
\ \"acc_stderr\": 0.008156331616616547\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.0003921042190298392,\n\
\ \"f1\": 0.041601300335570565,\n \"f1_stderr\": 0.001164099674986064\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.006823351023502654,\n \
\ \"acc_stderr\": 0.0022675371022544935\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.516179952644041,\n \"acc_stderr\": 0.0140451261309786\n\
\ }\n}\n```"
repo_url: https://huggingface.co/KoboldAI/OPT-350M-Nerys-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_21T16_22_23.406290
path:
- '**/details_harness|drop|3_2023-10-21T16-22-23.406290.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-21T16-22-23.406290.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_21T16_22_23.406290
path:
- '**/details_harness|gsm8k|5_2023-10-21T16-22-23.406290.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-21T16-22-23.406290.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_21T16_22_23.406290
path:
- '**/details_harness|winogrande|5_2023-10-21T16-22-23.406290.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-21T16-22-23.406290.parquet'
- config_name: results
data_files:
- split: 2023_10_21T16_22_23.406290
path:
- results_2023-10-21T16-22-23.406290.parquet
- split: latest
path:
- results_2023-10-21T16-22-23.406290.parquet
---
# Dataset Card for Evaluation run of KoboldAI/OPT-350M-Nerys-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KoboldAI/OPT-350M-Nerys-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KoboldAI/OPT-350M-Nerys-v2](https://huggingface.co/KoboldAI/OPT-350M-Nerys-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KoboldAI__OPT-350M-Nerys-v2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-21T16:22:23.406290](https://huggingface.co/datasets/open-llm-leaderboard/details_KoboldAI__OPT-350M-Nerys-v2/blob/main/results_2023-10-21T16-22-23.406290.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298392,
"f1": 0.041601300335570565,
"f1_stderr": 0.001164099674986064,
"acc": 0.26150165183377183,
"acc_stderr": 0.008156331616616547
},
"harness|drop|3": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298392,
"f1": 0.041601300335570565,
"f1_stderr": 0.001164099674986064
},
"harness|gsm8k|5": {
"acc": 0.006823351023502654,
"acc_stderr": 0.0022675371022544935
},
"harness|winogrande|5": {
"acc": 0.516179952644041,
"acc_stderr": 0.0140451261309786
}
}
```
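In the results above, the top-level `"all"` block appears to be the unweighted mean of the per-task metrics: the overall accuracy matches the average of the gsm8k and winogrande accuracies. This is an observation about these particular numbers, not a documented guarantee of the leaderboard pipeline:

```python
import math

# Per-task accuracies from the latest-run results above.
acc_gsm8k = 0.006823351023502654
acc_winogrande = 0.516179952644041

# The top-level "all" accuracy reported above.
acc_all = 0.26150165183377183

# The unweighted mean of the two task accuracies reproduces the aggregate.
assert math.isclose((acc_gsm8k + acc_winogrande) / 2, acc_all)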
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
wujohns/gpt2-chitchat-learn | ---
dataset_info:
features:
- name: text
dtype: string
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 161238019
num_examples: 490001
- name: valid
num_bytes: 3190972
num_examples: 10000
download_size: 89438438
dataset_size: 164428991
---
# Dataset Card for "gpt2-chitchat-learn"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
strombergnlp/twitter_pos | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- part-of-speech
paperswithcode_id: ritter-pos
pretty_name: Twitter Part-of-speech
---
# Dataset Card for "twitter-pos"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://gate.ac.uk/wiki/twitter-postagger.html](https://gate.ac.uk/wiki/twitter-postagger.html)
- **Repository:** [https://github.com/GateNLP/gateplugin-Twitter](https://github.com/GateNLP/gateplugin-Twitter)
- **Paper:** [https://aclanthology.org/R13-1026/](https://aclanthology.org/R13-1026/)
- **Point of Contact:** [Leon Derczynski](https://github.com/leondz)
- **Size of downloaded dataset files:** 51.96 MiB
- **Size of the generated dataset:** 251.22 KiB
- **Total amount of disk used:** 52.05 MB
### Dataset Summary
Part-of-speech tagging is a basic NLP task. However, Twitter text
is difficult to part-of-speech tag: it is noisy, with linguistic errors and idiosyncratic style.
This dataset contains two datasets for English PoS tagging for tweets:
* Ritter, with train/dev/test
* Foster, with dev/test
The splits are defined in the Derczynski paper, but the data comes from Ritter and Foster:
* Ritter: [https://aclanthology.org/D11-1141.pdf](https://aclanthology.org/D11-1141.pdf),
* Foster: [https://www.aaai.org/ocs/index.php/ws/aaaiw11/paper/download/3912/4191](https://www.aaai.org/ocs/index.php/ws/aaaiw11/paper/download/3912/4191)
### Supported Tasks and Leaderboards
* [Part of speech tagging on Ritter](https://paperswithcode.com/sota/part-of-speech-tagging-on-ritter)
### Languages
English, non-region-specific. `bcp47:en`
## Dataset Structure
### Data Instances
An example of 'train' looks as follows.
```
{'id': '0', 'tokens': ['Antick', 'Musings', 'post', ':', 'Book-A-Day', '2010', '#', '243', '(', '10/4', ')', '--', 'Gray', 'Horses', 'by', 'Hope', 'Larson', 'http://bit.ly/as8fvc'], 'pos_tags': [23, 23, 22, 9, 23, 12, 22, 12, 5, 12, 6, 9, 23, 23, 16, 23, 23, 51]}
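Since `tokens` and `pos_tags` are parallel lists aligned by index, the instance above can be turned into (token, tag-id) pairs with a plain `zip`; a minimal sketch using the sample row:

```python
# The 'train' example shown above: tokens and pos_tags are aligned by index.
example = {
    "id": "0",
    "tokens": ["Antick", "Musings", "post", ":", "Book-A-Day", "2010", "#",
               "243", "(", "10/4", ")", "--", "Gray", "Horses", "by",
               "Hope", "Larson", "http://bit.ly/as8fvc"],
    "pos_tags": [23, 23, 22, 9, 23, 12, 22, 12, 5, 12, 6, 9, 23, 23, 16, 23, 23, 51],
}

# Pair each token with its integer tag id.
pairs = list(zip(example["tokens"], example["pos_tags"]))
```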
```
### Data Fields
The data fields are the same among all splits.
#### twitter-pos
- `id`: a `string` feature.
- `tokens`: a `list` of `string` features.
- `pos_tags`: a `list` of classification labels (`int`). Full tagset with indices:
```python
```
### Data Splits
| name |tokens|sentences|
|---------|----:|---------:|
|ritter train|10652|551|
|ritter dev |2242|118|
|ritter test |2291|118|
|foster dev |2998|270|
|foster test |2841|250|
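Summing the split sizes in the table above gives the corpus totals, which (counting tokens) are consistent with the declared `10K<n<100K` size category:

```python
# (tokens, sentences) per split, copied from the Data Splits table above.
splits = {
    "ritter train": (10652, 551),
    "ritter dev":   (2242, 118),
    "ritter test":  (2291, 118),
    "foster dev":   (2998, 270),
    "foster test":  (2841, 250),
}

# Corpus-wide totals.
total_tokens = sum(t for t, _ in splits.values())
total_sentences = sum(s for _, s in splits.values())

# Consistent with the declared size category.
assert 10_000 < total_tokens < 100_000
```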
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
### Citation Information
```
@inproceedings{ritter2011named,
title={Named entity recognition in tweets: an experimental study},
author={Ritter, Alan and Clark, Sam and Etzioni, Oren and others},
booktitle={Proceedings of the 2011 conference on empirical methods in natural language processing},
pages={1524--1534},
year={2011}
}
@inproceedings{foster2011hardtoparse,
title={\# hardtoparse: POS Tagging and Parsing the Twitterverse},
author={Foster, Jennifer and Cetinoglu, Ozlem and Wagner, Joachim and Le Roux, Joseph and Hogan, Stephen and Nivre, Joakim and Hogan, Deirdre and Van Genabith, Josef},
booktitle={Workshops at the Twenty-Fifth AAAI Conference on Artificial Intelligence},
year={2011}
}
@inproceedings{derczynski2013twitter,
title={Twitter part-of-speech tagging for all: Overcoming sparse and noisy data},
author={Derczynski, Leon and Ritter, Alan and Clark, Sam and Bontcheva, Kalina},
booktitle={Proceedings of the international conference recent advances in natural language processing ranlp 2013},
pages={198--206},
year={2013}
}
```
### Contributions
Author uploaded ([@leondz](https://github.com/leondz)) |
halilbabacan/autotrain-data-cognitive_distortions | ---
task_categories:
- text-classification
tags:
- cognitive distortions
- psychology
---
# AutoTrain Dataset for project: cognitive_distortions
## Dataset Description
This dataset has been automatically processed by AutoTrain for project cognitive_distortions.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "I have had a lot of change happen this last year in every possible area of life but my thinking patterns just seem to be more prominent and I am pretty scared to think where they may lead",
"target": 0
},
{
"text": "He knows but my parents do not My family is Mormon but I am not and I don\u2019t want to disappoint my parents more than I already have",
"target": 0
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"target": "ClassLabel(names=['Distortion', 'No Distortion'], id=None)"
}
```
### Dataset Splits
This dataset is split into a train and a validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 2821 |
| valid | 706 | |
0xHKG/solana_audit | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_eachadea__vicuna-13b | ---
pretty_name: Evaluation run of eachadea/vicuna-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [eachadea/vicuna-13b](https://huggingface.co/eachadea/vicuna-13b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_eachadea__vicuna-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-15T06:05:43.185046](https://huggingface.co/datasets/open-llm-leaderboard/details_eachadea__vicuna-13b/blob/main/results_2023-10-15T06-05-43.185046.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0018875838926174498,\n\
\ \"em_stderr\": 0.00044451099905591266,\n \"f1\": 0.06103502516778559,\n\
\ \"f1_stderr\": 0.0014093219432847165,\n \"acc\": 0.3930771978723926,\n\
\ \"acc_stderr\": 0.01001987826540043\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0018875838926174498,\n \"em_stderr\": 0.00044451099905591266,\n\
\ \"f1\": 0.06103502516778559,\n \"f1_stderr\": 0.0014093219432847165\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0758150113722517,\n \
\ \"acc_stderr\": 0.007291205723162591\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7103393843725335,\n \"acc_stderr\": 0.012748550807638271\n\
\ }\n}\n```"
repo_url: https://huggingface.co/eachadea/vicuna-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|arc:challenge|25_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_15T06_05_43.185046
path:
- '**/details_harness|drop|3_2023-10-15T06-05-43.185046.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-15T06-05-43.185046.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_15T06_05_43.185046
path:
- '**/details_harness|gsm8k|5_2023-10-15T06-05-43.185046.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-15T06-05-43.185046.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hellaswag|10_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T14:25:52.300291.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T14:25:52.300291.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T14:25:52.300291.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_15T06_05_43.185046
path:
- '**/details_harness|winogrande|5_2023-10-15T06-05-43.185046.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-15T06-05-43.185046.parquet'
- config_name: results
data_files:
- split: 2023_07_18T14_25_52.300291
path:
- results_2023-07-18T14:25:52.300291.parquet
- split: 2023_10_15T06_05_43.185046
path:
- results_2023-10-15T06-05-43.185046.parquet
- split: latest
path:
- results_2023-10-15T06-05-43.185046.parquet
---
# Dataset Card for Evaluation run of eachadea/vicuna-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/eachadea/vicuna-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [eachadea/vicuna-13b](https://huggingface.co/eachadea/vicuna-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_eachadea__vicuna-13b",
"harness_winogrande_5",
	split="latest")
```
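The configuration names appear to follow directly from the harness task ids embedded in the parquet file names: the `|`, `:`, and `-` separators are replaced with underscores. A small illustrative helper (not part of the `datasets` library) makes that mapping explicit:

```python
def config_name(task: str) -> str:
    """Map a harness task id (as it appears in the parquet file names)
    to the dataset config name used by load_dataset.

    Illustrative only: the rule is inferred from the config list above.
    """
    return task.replace("|", "_").replace(":", "_").replace("-", "_")

print(config_name("harness|winogrande|5"))              # harness_winogrande_5
print(config_name("harness|truthfulqa:mc|0"))           # harness_truthfulqa_mc_0
print(config_name("harness|hendrycksTest-virology|5"))  # harness_hendrycksTest_virology_5
```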
## Latest results
These are the [latest results from run 2023-10-15T06:05:43.185046](https://huggingface.co/datasets/open-llm-leaderboard/details_eachadea__vicuna-13b/blob/main/results_2023-10-15T06-05-43.185046.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.0018875838926174498,
"em_stderr": 0.00044451099905591266,
"f1": 0.06103502516778559,
"f1_stderr": 0.0014093219432847165,
"acc": 0.3930771978723926,
"acc_stderr": 0.01001987826540043
},
"harness|drop|3": {
"em": 0.0018875838926174498,
"em_stderr": 0.00044451099905591266,
"f1": 0.06103502516778559,
"f1_stderr": 0.0014093219432847165
},
"harness|gsm8k|5": {
"acc": 0.0758150113722517,
"acc_stderr": 0.007291205723162591
},
"harness|winogrande|5": {
"acc": 0.7103393843725335,
"acc_stderr": 0.012748550807638271
}
}
```
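As a sanity check, the aggregated "all" block can be reproduced from the per-task entries: "acc" appears to be a plain macro-average over the tasks that report accuracy, while "em" and "f1" come from the single drop task. The sketch below, with values copied from the JSON above, recomputes it:

```python
# Per-task metrics copied from the latest-run JSON above.
results = {
    "harness|drop|3": {"em": 0.0018875838926174498, "f1": 0.06103502516778559},
    "harness|gsm8k|5": {"acc": 0.0758150113722517},
    "harness|winogrande|5": {"acc": 0.7103393843725335},
}

# Macro-average over the tasks that report "acc" (drop reports em/f1 instead).
accs = [m["acc"] for m in results.values() if "acc" in m]
mean_acc = sum(accs) / len(accs)
print(round(mean_acc, 4))  # 0.3931, matching "acc" under "all"
```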
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Cheetor1996/Mana_Yu-Gi-Oh | ---
license: cc-by-2.0
language:
- en
tags:
- art
---
**Mana** from **Yu-Gi-Oh! Duel Monsters**
- *Trained with the anime (full-final-pruned) model.*
- *5 versions: 6, 7, 8, 9, and 10 epochs.*
- *Works well with the ALL, MIDD, OUTD, and OUTALL LoRA weight blocks (OUTD and OUTALL are highly recommended for more accurate results, especially with the "10 epochs" version).*
- *Try weights of 0.7 or higher (up to 0.8 recommended).*
albertvillanova/yaml-no | ---
license: apache-2.0
tags:
- no
- 'no'
- "no"
--- |
open-llm-leaderboard/details_aisquared__dlite-v2-355m | ---
pretty_name: Evaluation run of aisquared/dlite-v2-355m
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [aisquared/dlite-v2-355m](https://huggingface.co/aisquared/dlite-v2-355m) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aisquared__dlite-v2-355m\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-15T23:07:25.491864](https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__dlite-v2-355m/blob/main/results_2023-10-15T23-07-25.491864.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001572986577181208,\n\
\ \"em_stderr\": 0.000405845113241774,\n \"f1\": 0.055305159395973226,\n\
\ \"f1_stderr\": 0.001369522078512369,\n \"acc\": 0.26400947119179163,\n\
\ \"acc_stderr\": 0.007015202106702892\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001572986577181208,\n \"em_stderr\": 0.000405845113241774,\n\
\ \"f1\": 0.055305159395973226,\n \"f1_stderr\": 0.001369522078512369\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5280189423835833,\n\
\ \"acc_stderr\": 0.014030404213405784\n }\n}\n```"
repo_url: https://huggingface.co/aisquared/dlite-v2-355m
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|arc:challenge|25_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_15T23_07_25.491864
path:
- '**/details_harness|drop|3_2023-10-15T23-07-25.491864.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-15T23-07-25.491864.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_15T23_07_25.491864
path:
- '**/details_harness|gsm8k|5_2023-10-15T23-07-25.491864.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-15T23-07-25.491864.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hellaswag|10_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:14:13.332045.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T14:14:13.332045.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T14:14:13.332045.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_15T23_07_25.491864
path:
- '**/details_harness|winogrande|5_2023-10-15T23-07-25.491864.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-15T23-07-25.491864.parquet'
- config_name: results
data_files:
- split: 2023_07_19T14_14_13.332045
path:
- results_2023-07-19T14:14:13.332045.parquet
- split: 2023_10_15T23_07_25.491864
path:
- results_2023-10-15T23-07-25.491864.parquet
- split: latest
path:
- results_2023-10-15T23-07-25.491864.parquet
---
# Dataset Card for Evaluation run of aisquared/dlite-v2-355m
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/aisquared/dlite-v2-355m
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [aisquared/dlite-v2-355m](https://huggingface.co/aisquared/dlite-v2-355m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_aisquared__dlite-v2-355m",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-15T23:07:25.491864](https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__dlite-v2-355m/blob/main/results_2023-10-15T23-07-25.491864.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001572986577181208,
"em_stderr": 0.000405845113241774,
"f1": 0.055305159395973226,
"f1_stderr": 0.001369522078512369,
"acc": 0.26400947119179163,
"acc_stderr": 0.007015202106702892
},
"harness|drop|3": {
"em": 0.001572986577181208,
"em_stderr": 0.000405845113241774,
"f1": 0.055305159395973226,
"f1_stderr": 0.001369522078512369
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5280189423835833,
"acc_stderr": 0.014030404213405784
}
}
```
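The `"all"` block is simply the per-task metrics averaged over the tasks that report them. As a quick sanity check, it can be recomputed from the JSON above (a minimal sketch, not the leaderboard's actual aggregation code):

```python
# Sanity-check sketch: recompute the "all" block from the per-task entries above.
# Assumes the aggregation is a plain mean over the tasks that report each metric.
results = {
    "harness|drop|3": {"em": 0.001572986577181208, "f1": 0.055305159395973226},
    "harness|gsm8k|5": {"acc": 0.0},
    "harness|winogrande|5": {"acc": 0.5280189423835833},
}

def aggregate(metric: str) -> float:
    # Average a metric over the tasks that actually report it.
    values = [task[metric] for task in results.values() if metric in task]
    return sum(values) / len(values)

print(aggregate("acc"))  # ~0.2640, matching "all" -> "acc" above
```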
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mekaneeky/ateso-crowd-validated-paths | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
dataset_info:
features:
- name: Path
dtype: string
- name: Key
dtype: int64
- name: Speaker
dtype: string
- name: Transcription
dtype: string
splits:
- name: train
num_bytes: 691846
num_examples: 4829
- name: valid
num_bytes: 14470
num_examples: 100
- name: test
num_bytes: 13881
num_examples: 96
download_size: 274753
dataset_size: 720197
---
# Dataset Card for "ateso-crowd-validated-paths"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
esc-bench/esc-diagnostic-dataset | ---
annotations_creators:
- expert-generated
- crowdsourced
- machine-generated
language:
- en
language_creators:
- crowdsourced
- expert-generated
license:
- cc-by-4.0
- apache-2.0
- cc0-1.0
- cc-by-nc-3.0
- other
multilinguality:
- monolingual
pretty_name: ESC Diagnostic Dataset
size_categories:
- 100K<n<1M
- 1M<n<10M
source_datasets:
- original
- extended|librispeech_asr
- extended|common_voice
tags:
- asr
- benchmark
- speech
- esc
task_categories:
- automatic-speech-recognition
task_ids: []
extra_gated_prompt: |-
Three of the ESC datasets have specific terms of usage that must be agreed to before using the data.
To do so, fill in the access forms on the specific datasets' pages:
* Common Voice: https://huggingface.co/datasets/mozilla-foundation/common_voice_9_0
* GigaSpeech: https://huggingface.co/datasets/speechcolab/gigaspeech
* SPGISpeech: https://huggingface.co/datasets/kensho/spgispeech
extra_gated_fields:
I hereby confirm that I have registered on the original Common Voice page and agree to not attempt to determine the identity of speakers in the Common Voice dataset: checkbox
I hereby confirm that I have accepted the terms of usages on GigaSpeech page: checkbox
I hereby confirm that I have accepted the terms of usages on SPGISpeech page: checkbox
---
## ESC benchmark diagnostic dataset
## Dataset Summary
As part of the ESC benchmark, we provide a small, 8h diagnostic dataset of in-domain validation data with newly annotated transcriptions. The audio data is sampled from each of the ESC validation sets, giving a range of different domains and speaking styles. The transcriptions are annotated according to a consistent style guide with two formats: normalised and un-normalised. The dataset is structured in the same way as the ESC dataset, by grouping audio-transcription samples according to the dataset from which they were taken. We encourage participants to use this dataset when evaluating their systems to quickly assess performance on a range of different speech recognition conditions.
All eight datasets in ESC can be downloaded and prepared in just a single line of code through the Hugging Face Datasets library:
```python
from datasets import load_dataset
esc_diagnostic_ami = load_dataset("esc-benchmark/esc-diagnostic-dataset", "ami")
```
Each dataset has two splits, `clean` and `other`. To get the clean diagnostic subset of AMI:
```python
ami_diagnostic_clean = esc_diagnostic_ami["clean"]
```
The datasets come fully prepared, so the audio and transcription files can be used directly in training/evaluation scripts.
## Dataset Information
A data point can be accessed by indexing the dataset object loaded through `load_dataset`:
```python
print(ami_diagnostic_clean[0])
```
A typical data point comprises the path to the audio file and its transcription. Also included are the name of the dataset from which the sample derives and a unique identifier:
```python
{
'audio': {'path': None,
'array': array([ 7.01904297e-04, 7.32421875e-04, 7.32421875e-04, ...,
-2.74658203e-04, -1.83105469e-04, -3.05175781e-05]),
'sampling_rate': 16000},
'ortho_transcript': 'So, I guess we have to reflect on our experiences with remote controls to decide what, um, we would like to see in a convenient practical',
'norm_transcript': 'so i guess we have to reflect on our experiences with remote controls to decide what um we would like to see in a convenient practical',
'id': 'AMI_ES2011a_H00_FEE041_0062835_0064005',
'dataset': 'ami',
}
```
### Data Fields
- `audio`: a dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate.
- `ortho_transcript`: the orthographic transcription of the audio file.
- `norm_transcript`: the normalized transcription of the audio file.
- `id`: unique id of the data sample.
- `dataset`: string name of a dataset the sample belongs to.
### Data Preparation
#### Audio
The audio for all ESC datasets is segmented into sample lengths suitable for training ASR systems. The Hugging Face datasets library decodes audio files on the fly, reading the segments and converting them to Python arrays. Consequently, no further preparation of the audio is required to be used in training/evaluation scripts.
Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the `"audio"` column, i.e. `dataset[0]["audio"]` should always be preferred over `dataset["audio"][0]`.
#### Transcriptions
The transcriptions corresponding to each audio file are provided in their 'error corrected' format. No transcription pre-processing is applied to the text, only necessary 'error correction' steps such as removing junk tokens (_<unk>_) or converting symbolic punctuation to spelled out form (_<comma>_ to _,_). As such, no further preparation of the transcriptions is required to be used in training/evaluation scripts.
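As an illustration, the "error correction" steps described above could be sketched as follows. This is a minimal sketch, not the benchmark's actual pipeline; the token inventory beyond `<unk>` and `<comma>` is an assumption for demonstration purposes:

```python
import re

def error_correct(transcript: str) -> str:
    """Apply minimal 'error correction' to a raw transcription."""
    # Remove junk tokens such as <unk>
    transcript = transcript.replace("<unk>", "")
    # Convert symbolic punctuation tokens to symbols, e.g. <comma> -> ","
    # (tokens other than <comma> here are illustrative assumptions)
    for token, symbol in {"<comma>": ",", "<period>": ".", "<questionmark>": "?"}.items():
        transcript = transcript.replace(token, symbol)
    # Tidy up any doubled whitespace left behind by the removals
    return re.sub(r"\s{2,}", " ", transcript).strip()
```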
Transcriptions are provided for training and validation splits. The transcriptions are **not** provided for the test splits. The ESC benchmark requires you to generate predictions for the test sets and upload them to https://huggingface.co/spaces/esc-benchmark/esc for scoring.
### Access
All eight of the datasets in ESC are accessible and licensing is freely available. Three of the ESC datasets have specific terms of usage that must be agreed to before using the data. To do so, fill in the access forms on the specific datasets' pages:
* Common Voice: https://huggingface.co/datasets/mozilla-foundation/common_voice_9_0
* GigaSpeech: https://huggingface.co/datasets/speechcolab/gigaspeech
* SPGISpeech: https://huggingface.co/datasets/kensho/spgispeech
### Diagnostic Dataset
ESC contains a small, 8h diagnostic dataset of in-domain validation data with newly annotated transcriptions. The audio data is sampled from each of the ESC validation sets, giving a range of different domains and speaking styles. The transcriptions are annotated according to a consistent style guide with two formats: normalised and un-normalised. The dataset is structured in the same way as the ESC dataset, by grouping audio-transcription samples according to the dataset from which they were taken. We encourage participants to use this dataset when evaluating their systems to quickly assess performance on a range of different speech recognition conditions. For more information, visit: [esc-benchmark/esc-diagnostic-dataset](https://huggingface.co/datasets/esc-benchmark/esc-diagnostic-dataset).
## LibriSpeech
The LibriSpeech corpus is a standard large-scale corpus for assessing ASR systems. It consists of approximately 1,000 hours of narrated audiobooks from the [LibriVox](https://librivox.org) project. It is licensed under CC-BY-4.0.
Example Usage:
```python
librispeech = load_dataset("esc-benchmark/esc-datasets", "librispeech")
```
Train/validation splits:
- `train` (combination of `train.clean.100`, `train.clean.360` and `train.other.500`)
- `validation.clean`
- `validation.other`
Test splits:
- `test.clean`
- `test.other`
Also available are subsets of the train split, which can be accessed by setting the `subconfig` argument:
```python
librispeech = load_dataset("esc-benchmark/esc-datasets", "librispeech", subconfig="clean.100")
```
- `clean.100`: 100 hours of training data from the 'clean' subset
- `clean.360`: 360 hours of training data from the 'clean' subset
- `other.500`: 500 hours of training data from the 'other' subset
## Common Voice
Common Voice is a series of crowd-sourced open-licensed speech datasets where speakers record text from Wikipedia in various languages. The English subset contains approximately 1,400 hours of audio data from speakers of various nationalities, accents and different recording conditions. It is licensed under CC0-1.0.
Example usage:
```python
common_voice = load_dataset("esc-benchmark/esc-datasets", "common_voice", use_auth_token=True)
```
Training/validation splits:
- `train`
- `validation`
Test splits:
- `test`
## VoxPopuli
VoxPopuli is a large-scale multilingual speech corpus consisting of political data sourced from 2009-2020 European Parliament event recordings. The English subset contains approximately 550 hours of speech largely from non-native English speakers. It is licensed under CC0.
Example usage:
```python
voxpopuli = load_dataset("esc-benchmark/esc-datasets", "voxpopuli")
```
Training/validation splits:
- `train`
- `validation`
Test splits:
- `test`
## TED-LIUM
TED-LIUM consists of English-language TED Talk conference videos covering a range of different cultural, political, and academic topics. It contains approximately 450 hours of transcribed speech data. It is licensed under CC-BY-NC-ND 3.0.
Example usage:
```python
tedlium = load_dataset("esc-benchmark/esc-datasets", "tedlium")
```
Training/validation splits:
- `train`
- `validation`
Test splits:
- `test`
## GigaSpeech
GigaSpeech is a multi-domain English speech recognition corpus created from audiobooks, podcasts and YouTube. We provide the large train set (2,500 hours) and the standard validation and test splits. It is licensed under apache-2.0.
Example usage:
```python
gigaspeech = load_dataset("esc-benchmark/esc-datasets", "gigaspeech", use_auth_token=True)
```
Training/validation splits:
- `train` (`l` subset of training data (2,500 h))
- `validation`
Test splits:
- `test`
Also available are subsets of the train split, which can be accessed by setting the `subconfig` argument:
```python
gigaspeech = load_dataset("esc-benchmark/esc-datasets", "gigaspeech", subconfig="xs", use_auth_token=True)
```
- `xs`: extra-small subset of training data (10 h)
- `s`: small subset of training data (250 h)
- `m`: medium subset of training data (1,000 h)
- `xl`: extra-large subset of training data (10,000 h)
## SPGISpeech
SPGISpeech consists of company earnings calls that have been manually transcribed by S&P Global, Inc. according to a professional style guide. We provide the large train set (5,000 hours) and the standard validation and test splits. It is licensed under a Kensho user agreement.
Loading the dataset requires authorization.
Example usage:
```python
spgispeech = load_dataset("esc-benchmark/esc-datasets", "spgispeech", use_auth_token=True)
```
Training/validation splits:
- `train` (`l` subset of training data (~5,000 h))
- `validation`
Test splits:
- `test`
Also available are subsets of the train split, which can be accessed by setting the `subconfig` argument:
```python
spgispeech = load_dataset("esc-benchmark/esc-datasets", "spgispeech", subconfig="s", use_auth_token=True)
```
- `s`: small subset of training data (~200 h)
- `m`: medium subset of training data (~1,000 h)
## Earnings-22
Earnings-22 is a 119-hour corpus of English-language earnings calls collected from global companies, with speakers of many different nationalities and accents. It is licensed under CC-BY-SA-4.0.
Example usage:
```python
earnings22 = load_dataset("esc-benchmark/esc-datasets", "earnings22")
```
Training/validation splits:
- `train`
- `validation`
Test splits:
- `test`
## AMI
The AMI Meeting Corpus consists of 100 hours of meeting recordings from multiple recording devices synced to a common timeline. It is licensed under CC-BY-4.0.
Example usage:
```python
ami = load_dataset("esc-benchmark/esc-datasets", "ami")
```
Training/validation splits:
- `train`
- `validation`
Test splits:
- `test` |
LexiconShiftInnovations/SinhalaCorpusLarge | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5219872462
num_examples: 13493126
download_size: 2179039811
dataset_size: 5219872462
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: apache-2.0
task_categories:
- text-generation
language:
- si
tags:
- Sinhalacorpus
- sinhalallm
- sinhaladataset
- sinhalatrainin
- webscraped
size_categories:
- 10M<n<100M
--- |
katarinagresova/Genomic_Benchmarks_demo_coding_vs_intergenomic_seqs | ---
dataset_info:
features:
- name: seq
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 15900000
num_examples: 75000
- name: test
num_bytes: 5300000
num_examples: 25000
download_size: 2456511
dataset_size: 21200000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "Genomic_Benchmarks_demo_coding_vs_intergenomic_seqs"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cj-mills/cvat-instance-segmentation-toy-dataset | ---
license: mit
---
|
CyberHarem/kolulu_granbluefantasy | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kolulu (Granblue Fantasy)
This is the dataset of kolulu (Granblue Fantasy), containing 45 images and their tags.
The core tags of this character are `green_hair, hair_over_one_eye, long_hair, dark_skin, dark-skinned_female, green_eyes, hat, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 45 | 75.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kolulu_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 45 | 39.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kolulu_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 111 | 86.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kolulu_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 45 | 64.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kolulu_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 111 | 130.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kolulu_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kolulu_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, solo, navel, hat_flower, outdoors, beach, looking_at_viewer, open_mouth, smile, white_bikini, blush, holding, ocean, sky, day, food |
| 1 | 7 |  |  |  |  |  | 1girl, bandaged_leg, solo, looking_at_viewer, navel, cloak, holding, staff, closed_mouth, skirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | navel | hat_flower | outdoors | beach | looking_at_viewer | open_mouth | smile | white_bikini | blush | holding | ocean | sky | day | food | bandaged_leg | cloak | staff | closed_mouth | skirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:-------------|:-----------|:--------|:--------------------|:-------------|:--------|:---------------|:--------|:----------|:--------|:------|:------|:-------|:---------------|:--------|:--------|:---------------|:--------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | | | | X | | | | | X | | | | | X | X | X | X | X |
|
AdapterOcean/datasci-standardized_cluster_0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 13439608
num_examples: 1294
download_size: 4112832
dataset_size: 13439608
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "datasci-standardized_cluster_0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lipi17/Building-Cracks-Merged | ---
task_categories:
- object-detection
tags:
- roboflow
- roboflow2huggingface
---
<div align="center">
<img width="640" alt="lipi17/building-cracks-merged" src="https://huggingface.co/datasets/lipi17/building-cracks-merged/resolve/main/thumbnail.jpg">
</div>
### Dataset Labels
```
['crack', 'stairstep_crack']
```
### Number of Images
```json
{'test': 11, 'valid': 433, 'train': 947}
```
### How to Use
- Install [datasets](https://pypi.org/project/datasets/):
```bash
pip install datasets
```
- Load the dataset:
```python
from datasets import load_dataset
ds = load_dataset("lipi17/building-cracks-merged", name="full")
example = ds['train'][0]
```
### Roboflow Dataset Page
[https://universe.roboflow.com/lipi-deepaakshi-patnaik-ktyz8/merged-building-cracks/dataset/1](https://universe.roboflow.com/lipi-deepaakshi-patnaik-ktyz8/merged-building-cracks/dataset/1?ref=roboflow2huggingface)
### Citation
```
@misc{ merged-building-cracks_dataset,
title = { Merged-Building-Cracks Dataset },
type = { Open Source Dataset },
author = { Lipi Deepaakshi Patnaik },
howpublished = { \\url{ https://universe.roboflow.com/lipi-deepaakshi-patnaik-ktyz8/merged-building-cracks } },
url = { https://universe.roboflow.com/lipi-deepaakshi-patnaik-ktyz8/merged-building-cracks },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2023 },
month = { oct },
note = { visited on 2023-10-21 },
}
```
### License
MIT
### Dataset Summary
This dataset was exported via roboflow.com on October 21, 2023 at 12:21 PM GMT
Roboflow is an end-to-end computer vision platform that helps you
* collaborate with your team on computer vision projects
* collect & organize images
* understand and search unstructured image data
* annotate, and create datasets
* export, train, and deploy computer vision models
* use active learning to improve your dataset over time
For state of the art Computer Vision training notebooks you can use with this dataset,
visit https://github.com/roboflow/notebooks
To find over 100k other datasets and pre-trained models, visit https://universe.roboflow.com
The dataset includes 1391 images.
Cracks are annotated in COCO format.
The following pre-processing was applied to each image:
* Auto-orientation of pixel data (with EXIF-orientation stripping)
* Resize to 640x640 (Stretch)
No image augmentation techniques were applied.
|
pharaouk/biology_dataset_standardized_cluster_17 | ---
dataset_info:
features: []
splits:
- name: train
num_bytes: 0
num_examples: 0
download_size: 324
dataset_size: 0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "biology_dataset_standardized_cluster_17"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlfredPros/smart-contracts-instructions | ---
task_categories:
- question-answering
language:
- en
tags:
- code
- blockchain
- smart contract
- solidity
size_categories:
- 1K<n<10K
viewer: true
---
# Smart Contracts Instructions
A dataset containing 6,003 GPT-generated pairs of human instructions and Solidity source code.
GPT models used to make this data are GPT-3.5 turbo, GPT-3.5 turbo 16k context, and GPT-4. Solidity source codes are used from mwritescode's Slither Audited Smart Contracts (https://huggingface.co/datasets/mwritescode/slither-audited-smart-contracts).
Distributions of the GPT models used to make this dataset:
- GPT-3.5 Turbo: 5,276
- GPT-3.5 Turbo 16k Context: 678
- GPT-4: 49
Solidity source code in this dataset has been processed to replace runs of three or more newline characters with double newline characters and to delete "Submitted for verification at " comments.
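The two processing steps could be sketched roughly as follows. The function name and regular expressions here are illustrative assumptions, not the pipeline actually used to build the dataset:

```python
import re

def preprocess_solidity(source: str) -> str:
    """Apply the two clean-up steps described above to a Solidity source file."""
    # Collapse runs of three or more newlines down to exactly two
    source = re.sub(r"\n{3,}", "\n\n", source)
    # Drop Etherscan-style "Submitted for verification at ..." comment lines
    source = re.sub(r"^.*Submitted for verification at.*\n?", "", source, flags=re.MULTILINE)
    return source
```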
# Example Usage
```py
from datasets import load_dataset
# Load dataset
dataset = load_dataset("AlfredPros/smart-contracts-instructions", split="train")
# Print the first row instruction
print(dataset["instruction"][0])
``` |
lumenwrites/gdquest-test | ---
dataset_info:
features:
- name: path
dtype: string
- name: sentence
dtype: string
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 155132.0
num_examples: 14
download_size: 165791
dataset_size: 155132.0
---
# Dataset Card for "gdquest-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
svakulenk0/qrecc | ---
pretty_name: QReCC
language_creators:
- expert-generated
- found
language:
- en
license:
- cc-by-3.0
multilinguality:
- monolingual
source_datasets:
- extended|natural_questions
- extended|quac
task_categories:
- question-answering
task_ids:
- open-domain-qa
---
# Dataset Card for QReCC: Question Rewriting in Conversational Context
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Source Data](#source-data)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- [**Repository:**](https://github.com/apple/ml-qrecc)
- [**Paper:**](https://arxiv.org/pdf/2010.04898.pdf)
- [**Leaderboard:**](https://www.tira.io/task/scai-qrecc/dataset/scai-qrecc21-test-dataset-2021-07-20)
### Dataset Summary
QReCC (Question Rewriting in Conversational Context) is an end-to-end open-domain question answering dataset comprising 14K conversations with 81K question-answer pairs. The goal of this dataset is to provide a challenging benchmark for end-to-end conversational question answering that includes the individual subtasks of question rewriting, passage retrieval and reading comprehension.
The task in QReCC is to find answers to conversational questions within a collection of 10M web pages split into 54M passages. Answers to questions in the same conversation may be distributed across several web pages.
The passage collection should be downloaded from [**Zenodo**](https://zenodo.org/record/5115890#.YaeD7C8RppR) (passages.zip)
### Supported Tasks and Leaderboards
`question-answering`
### Languages
English
## Dataset Structure
### Data Instances
An example from the data set looks as follows:
```
{
"Context": [
"What are the pros and cons of electric cars?",
"Some pros are: They're easier on the environment. Electricity is cheaper than gasoline. Maintenance is less frequent and less expensive. They're very quiet. You'll get tax credits. They can shorten your commute time. Some cons are: Most EVs have pretty short ranges. Recharging can take a while."
],
"Question": "Tell me more about Tesla",
"Rewrite": "Tell me more about Tesla the car company.",
"Answer": "Tesla Inc. is an American automotive and energy company based in Palo Alto, California. The company specializes in electric car manufacturing and, through its SolarCity subsidiary, solar panel manufacturing.",
"Answer_URL": "https://en.wikipedia.org/wiki/Tesla,_Inc.",
"Conversation_no": 74,
"Turn_no": 2,
"Conversation_source": "trec"
}
```
### Data Splits
- train: 63501
- test: 16451
## Dataset Creation
### Source Data
- QuAC
- TREC CAsT
- Natural Questions
## Additional Information
### Licensing Information
[CC BY-SA 3.0](http://creativecommons.org/licenses/by-sa/3.0/)
### Citation Information
```
@inproceedings{ qrecc,
title={Open-Domain Question Answering Goes Conversational via Question Rewriting},
author={Anantha, Raviteja and Vakulenko, Svitlana and Tu, Zhucheng and Longpre, Shayne and Pulman, Stephen and Chappidi, Srinivas},
booktitle={ NAACL },
year={2021}
}
``` |
mask-distilled-one-sec-cv12/chunk_97 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1233302768
num_examples: 242204
download_size: 1258110810
dataset_size: 1233302768
---
# Dataset Card for "chunk_97"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rick012/celeb-identities | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
0: Cristiano_Ronaldo
1: Jay_Z
2: Nicki_Minaj
3: Peter_Obi
4: Roger_Federer
5: Serena_Williams
splits:
- name: train
num_bytes: 195536.0
num_examples: 18
download_size: 193243
dataset_size: 195536.0
---
# Dataset Card for "celeb-identities"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kristmh/mongoDB_testset_highest_high | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validate
path: data/validate-*
dataset_info:
features:
- name: text_clean
dtype: string
- name: label
dtype: int64
splits:
- name: test
num_bytes: 435062
num_examples: 780
- name: train
num_bytes: 3907220
num_examples: 6230
- name: validate
num_bytes: 467758
num_examples: 778
download_size: 2348096
dataset_size: 4810040
---
# Dataset Card for "mongoDB_testset_highest_high"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wildercb/PrivacyTextBooks1.2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 9350570
num_examples: 155008
download_size: 5519977
dataset_size: 9350570
---
# Dataset Card for "PrivacyTextBooks1.2"
A training dataset containing text from privacy engineering textbooks and papers, from the following sources:
textbooks:
Breached! - D. Solove
Information privacy law - D. Solove
Privacy: What Everyone Needs to Know - Oxford University Press (2017) - Leslie P. Francis, John G. Francis
PrivacyEngineersManifesto
Protection and privacy in transitional times
William Stallings - Information Privacy Engineering and Privacy by Design_ Understanding privacy threats, technologies, and regulations (6 Dec 2019, Addison-Wesley Professional)
papers / talks:
I've Got Nothing to Hide - D. Solove
Understanding privacy - D. Solove
Right to privacy - Warren / Brandeis
A critical analysis of privacy by design strategies - Colesky
Stanford Philosophy Review - Privacy
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Daviddddddd1/TAUKADIAL-24 | ---
license: mit
---
|
TempoFunk/medium | ---
size_categories:
- 10K<n<100K
license: agpl-3.0
task_categories:
- text-to-video
language:
- en
pretty_name: Medium
---
Current size: 53,081 videos
Goal (todo): 100,000+ |
pavanmantha/fiqa | ---
license: apache-2.0
---
|
mask-distilled-one-sec-cv12/chunk_108 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1227172000
num_examples: 241000
download_size: 1252378289
dataset_size: 1227172000
---
# Dataset Card for "chunk_108"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_222 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1282522040.0
num_examples: 251870
download_size: 1310694151
dataset_size: 1282522040.0
---
# Dataset Card for "chunk_222"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rawkintrevo/sgc2c-w-gpt3-summaries | ---
dataset_info:
features:
- name: script
dtype: string
- name: synopsis
dtype: string
splits:
- name: train
num_bytes: 1289598
num_examples: 93
download_size: 679035
dataset_size: 1289598
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tingchih/multi_news_doc | ---
dataset_info:
features:
- name: document
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 558392265
num_examples: 44972
- name: validation
num_bytes: 68272432
num_examples: 5622
- name: test
num_bytes: 70032124
num_examples: 5622
download_size: 403220650
dataset_size: 696696821
---
# Dataset Card for "multi_news_doc"
# This is a copy from multi_news dataset
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
totally-not-an-llm/airoboros-textbook-gpt4-graded | ---
license: other
license_name: airoboros
license_link: LICENSE
---
Graded by gpt4-0314 with this prompt:
```
A textbook entry has been proposed that would be written following the instruction:
{instruction}
Rate the educational value of the proposal from 1-100 for a LLM trying to learn english, general knowledge, python coding, logic, reasoning, etc.
Simply give the numerical rating with no explanation.
```
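The template above could be filled per sample roughly like this. The variable and function names are illustrative assumptions, not the repository's actual grading script:

```python
# Grading prompt template as quoted above; {instruction} is substituted per sample
GRADING_PROMPT = (
    "A textbook entry has been proposed that would be written following the instruction:\n"
    "{instruction}\n"
    "Rate the educational value of the proposal from 1-100 for a LLM trying to learn "
    "english, general knowledge, python coding, logic, reasoning, etc.\n"
    "Simply give the numerical rating with no explanation."
)

def build_grading_prompt(instruction: str) -> str:
    # Substitute a sample's instruction into the grading template
    return GRADING_PROMPT.format(instruction=instruction)
```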
Currently unfinished |
liuyanchen1015/MULTI_VALUE_cola_to_infinitive | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 212
num_examples: 3
- name: test
num_bytes: 274
num_examples: 4
- name: train
num_bytes: 3549
num_examples: 43
download_size: 8001
dataset_size: 4035
---
# Dataset Card for "MULTI_VALUE_cola_to_infinitive"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/datasci-standardized_cluster_1_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1665954
num_examples: 687
download_size: 847238
dataset_size: 1665954
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "datasci-standardized_cluster_1_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/msmarco-passage_trec-dl-hard_fold2 | ---
pretty_name: '`msmarco-passage/trec-dl-hard/fold2`'
viewer: false
source_datasets: ['irds/msmarco-passage']
task_categories:
- text-retrieval
---
# Dataset Card for `msmarco-passage/trec-dl-hard/fold2`
The `msmarco-passage/trec-dl-hard/fold2` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/msmarco-passage#msmarco-passage/trec-dl-hard/fold2).
# Data
This dataset provides:
- `queries` (i.e., topics); count=10
- `qrels`: (relevance assessments); count=898
- For `docs`, use [`irds/msmarco-passage`](https://huggingface.co/datasets/irds/msmarco-passage)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/msmarco-passage_trec-dl-hard_fold2', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/msmarco-passage_trec-dl-hard_fold2', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Mackie2021DlHard,
title={How Deep is your Learning: the DL-HARD Annotated Deep Learning Dataset},
author={Iain Mackie and Jeffrey Dalton and Andrew Yates},
journal={ArXiv},
year={2021},
volume={abs/2105.07975}
}
@inproceedings{Bajaj2016Msmarco,
title={MS MARCO: A Human Generated MAchine Reading COmprehension Dataset},
author={Payal Bajaj, Daniel Campos, Nick Craswell, Li Deng, Jianfeng Gao, Xiaodong Liu, Rangan Majumder, Andrew McNamara, Bhaskar Mitra, Tri Nguyen, Mir Rosenberg, Xia Song, Alina Stoica, Saurabh Tiwary, Tong Wang},
booktitle={InCoCo@NIPS},
year={2016}
}
```
|
open-llm-leaderboard/details_Undi95__MLewd-L2-13B | ---
pretty_name: Evaluation run of Undi95/MLewd-L2-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/MLewd-L2-13B](https://huggingface.co/Undi95/MLewd-L2-13B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__MLewd-L2-13B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-18T07:35:31.407630](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__MLewd-L2-13B/blob/main/results_2023-10-18T07-35-31.407630.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the \"results\" configuration and the \"latest\" split of\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.012164429530201342,\n\
\ \"em_stderr\": 0.0011226072817372202,\n \"f1\": 0.09181417785234938,\n\
\ \"f1_stderr\": 0.0019450870531667406,\n \"acc\": 0.37384759088376845,\n\
\ \"acc_stderr\": 0.007756725366346258\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.012164429530201342,\n \"em_stderr\": 0.0011226072817372202,\n\
\ \"f1\": 0.09181417785234938,\n \"f1_stderr\": 0.0019450870531667406\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01288855193328279,\n \
\ \"acc_stderr\": 0.003106901266499655\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7348066298342542,\n \"acc_stderr\": 0.012406549466192861\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Undi95/MLewd-L2-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|arc:challenge|25_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_18T07_35_31.407630
path:
- '**/details_harness|drop|3_2023-10-18T07-35-31.407630.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-18T07-35-31.407630.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_18T07_35_31.407630
path:
- '**/details_harness|gsm8k|5_2023-10-18T07-35-31.407630.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-18T07-35-31.407630.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hellaswag|10_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T05:06:12.728207.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-05T05:06:12.728207.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-05T05:06:12.728207.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_18T07_35_31.407630
path:
- '**/details_harness|winogrande|5_2023-10-18T07-35-31.407630.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-18T07-35-31.407630.parquet'
- config_name: results
data_files:
- split: 2023_09_05T05_06_12.728207
path:
- results_2023-09-05T05:06:12.728207.parquet
- split: 2023_10_18T07_35_31.407630
path:
- results_2023-10-18T07-35-31.407630.parquet
- split: latest
path:
- results_2023-10-18T07-35-31.407630.parquet
---
# Dataset Card for Evaluation run of Undi95/MLewd-L2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/MLewd-L2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/MLewd-L2-13B](https://huggingface.co/Undi95/MLewd-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__MLewd-L2-13B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-18T07:35:31.407630](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__MLewd-L2-13B/blob/main/results_2023-10-18T07-35-31.407630.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.012164429530201342,
"em_stderr": 0.0011226072817372202,
"f1": 0.09181417785234938,
"f1_stderr": 0.0019450870531667406,
"acc": 0.37384759088376845,
"acc_stderr": 0.007756725366346258
},
"harness|drop|3": {
"em": 0.012164429530201342,
"em_stderr": 0.0011226072817372202,
"f1": 0.09181417785234938,
"f1_stderr": 0.0019450870531667406
},
"harness|gsm8k|5": {
"acc": 0.01288855193328279,
"acc_stderr": 0.003106901266499655
},
"harness|winogrande|5": {
"acc": 0.7348066298342542,
"acc_stderr": 0.012406549466192861
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
aneeshas/toy-tla-data | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 83501
num_examples: 50
- name: test
num_bytes: 28275
num_examples: 20
download_size: 86160
dataset_size: 111776
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "toy-tla-data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
semeru/text-code-CodeSummarization | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: validation
num_bytes: 23742716
num_examples: 104273
- name: test
num_bytes: 20824989
num_examples: 90908
download_size: 0
dataset_size: 44567705
---
# Dataset Card for "CS_finetuning"
## Reference
<pre><code>@article{Mastropaolo2022TransferLearningForCodeRelatedTasks,
title={Using Transfer Learning for Code-Related Tasks},
author={Mastropaolo, Antonio and Cooper, Nathan and Nader Palacio, David and Scalabrino, Simone and
Poshyvanyk, Denys and Oliveto, Rocco and Bavota, Gabriele},
journal={arXiv preprint arXiv:2206.08574},
year={2022}
}</code></pre> |
breakend/nllb-multi-domain | ---
language:
- en
- ru
- ayr
- bho
- dyu
- fur
- wol
annotations_creators:
- found
language_creators:
- expert-generated
license:
- cc-by-sa-4.0
multilinguality:
- multilingual
- translation
pretty_name: nllb-multi-domain
size_categories:
- unknown
source_datasets:
- extended|flores
task_categories:
- conditional-text-generation
task_ids:
- machine-translation
paperswithcode_id: flores
---
# Dataset Card for NLLB Multi-Domain
## Table of Contents
- [Dataset Card for NLLB Multi-Domain](#dataset-card-for-nllb-multi-domain)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Home:** [Flores](https://github.com/facebookresearch/flores/tree/main/nllb_md)
- **Repository:** [Github](https://github.com/facebookresearch/flores/tree/main/nllb_md)
### Dataset Summary
NLLB Multi Domain is a set of professionally-translated sentences in News, Unscripted informal speech, and Health domains. It is designed to enable assessment of out-of-domain performance and to study domain adaptation for machine translation. Each domain has approximately 3000 sentences.
### Supported Tasks and Leaderboards
#### Multilingual Machine Translation
Refer to the [Dynabench leaderboard](https://dynabench.org/flores/Flores%20MT%20Evaluation%20(FULL)) for additional details on model evaluation on FLORES-101 in the context of the WMT2021 shared task on [Large-Scale Multilingual Machine Translation](http://www.statmt.org/wmt21/large-scale-multilingual-translation-task.html). FLORES-200 is an extension of this.
### Languages
Language | FLORES-200 code
---|---
Central Aymara | ayr_Latn
Bhojpuri | bho_Deva
Dyula | dyu_Latn
Friulian | fur_Latn
Russian | rus_Cyrl
Wolof | wol_Latn
Use a hyphenated pairing to get two languages in one datapoint (e.g., "eng_Latn-rus_Cyrl" will provide sentences in the format below).
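A minimal sketch of how such a pairing could be built and loaded (the config-name construction below is an assumption based on the example above; the commented-out `load_dataset` call requires the `datasets` library and network access):

```python
# Join two FLORES-200 codes from the table above with a hyphen
# to form a paired config name such as "eng_Latn-rus_Cyrl".
src, tgt = "eng_Latn", "rus_Cyrl"
config = f"{src}-{tgt}"
print(config)  # eng_Latn-rus_Cyrl

# from datasets import load_dataset
# data = load_dataset("breakend/nllb-multi-domain", config)
```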
## Dataset Structure
### Data Instances
See Dataset Viewer.
The text is provided as in the original dataset, without further preprocessing or tokenization.
### Data Fields
- `id`: Row number for the data entry, starting at 1.
- `sentence`: The full sentence in the specific language (may have a `_lang` suffix for pairings).
- `domain`: The domain of the sentence.
## Dataset Creation
Please refer to the original article [No Language Left Behind: Scaling Human-Centered Machine Translation](https://arxiv.org/abs/2207.04672) for additional information on dataset creation.
## Additional Information
### Dataset Curators
See paper for details.
### Licensing Information
Licensed with Creative Commons Attribution Share Alike 4.0. License available [here](https://creativecommons.org/licenses/by-sa/4.0/).
### Citation Information
Please cite the authors if you use these corpora in your work:
```bibtex
@article{nllb2022,
author = {NLLB Team, Marta R. Costa-jussà, James Cross, Onur Çelebi, Maha Elbayad, Kenneth Heafield, Kevin Heffernan, Elahe Kalbassi, Janice Lam, Daniel Licht, Jean Maillard, Anna Sun, Skyler Wang, Guillaume Wenzek, Al Youngblood, Bapi Akula, Loic Barrault, Gabriel Mejia Gonzalez, Prangthip Hansanti, John Hoffman, Semarley Jarrett, Kaushik Ram Sadagopan, Dirk Rowe, Shannon Spruit, Chau Tran, Pierre Andrews, Necip Fazil Ayan, Shruti Bhosale, Sergey Edunov, Angela Fan, Cynthia Gao, Vedanuj Goswami, Francisco Guzmán, Philipp Koehn, Alexandre Mourachko, Christophe Ropers, Safiyyah Saleem, Holger Schwenk, Jeff Wang},
title = {No Language Left Behind: Scaling Human-Centered Machine Translation},
year = {2022}
}
```
Please also cite prior work that this dataset builds on:
```bibtex
@inproceedings{,
title={The FLORES-101 Evaluation Benchmark for Low-Resource and Multilingual Machine Translation},
author={Goyal, Naman and Gao, Cynthia and Chaudhary, Vishrav and Chen, Peng-Jen and Wenzek, Guillaume and Ju, Da and Krishnan, Sanjana and Ranzato, Marc'Aurelio and Guzm\'{a}n, Francisco and Fan, Angela},
year={2021}
}
```
```bibtex
@inproceedings{,
title={Two New Evaluation Datasets for Low-Resource Machine Translation: Nepali-English and Sinhala-English},
author={Guzm\'{a}n, Francisco and Chen, Peng-Jen and Ott, Myle and Pino, Juan and Lample, Guillaume and Koehn, Philipp and Chaudhary, Vishrav and Ranzato, Marc'Aurelio},
journal={arXiv preprint arXiv:1902.01382},
year={2019}
}
``` |
freshpearYoon/vr_train_free_37 | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: filename
dtype: string
- name: NumOfUtterance
dtype: int64
- name: text
dtype: string
- name: samplingrate
dtype: int64
- name: begin_time
dtype: float64
- name: end_time
dtype: float64
- name: speaker_id
dtype: string
- name: directory
dtype: string
splits:
- name: train
num_bytes: 6433804484
num_examples: 10000
download_size: 1097997821
dataset_size: 6433804484
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/5e44347e | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 186
num_examples: 10
download_size: 1332
dataset_size: 186
---
# Dataset Card for "5e44347e"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Alexator26/ostatok_face_stickers_cleared | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: edit_prompt
dtype: string
- name: cartoonized_image
dtype: image
splits:
- name: train
num_bytes: 70948720.0
num_examples: 114
download_size: 70951318
dataset_size: 70948720.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
arbml/arastance | ---
dataset_info:
features:
- name: filename
dtype: string
- name: claim
dtype: string
- name: claim_url
dtype: string
- name: article
dtype: string
- name: stance
dtype:
class_label:
names:
0: Discuss
1: Disagree
2: Unrelated
3: Agree
- name: article_title
dtype: string
- name: article_url
dtype: string
splits:
- name: test
num_bytes: 5611165
num_examples: 646
- name: train
num_bytes: 29682402
num_examples: 2848
- name: validation
num_bytes: 7080226
num_examples: 569
download_size: 18033579
dataset_size: 42373793
---
# Dataset Card for "arastance"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ChaoticNeutrals__Eris_PrimeV3-Vision-7B | ---
pretty_name: Evaluation run of ChaoticNeutrals/Eris_PrimeV3-Vision-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ChaoticNeutrals/Eris_PrimeV3-Vision-7B](https://huggingface.co/ChaoticNeutrals/Eris_PrimeV3-Vision-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ChaoticNeutrals__Eris_PrimeV3-Vision-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-22T00:36:51.722400](https://huggingface.co/datasets/open-llm-leaderboard/details_ChaoticNeutrals__Eris_PrimeV3-Vision-7B/blob/main/results_2024-03-22T00-36-51.722400.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.657691343561988,\n\
\ \"acc_stderr\": 0.031893777564086824,\n \"acc_norm\": 0.6579346610974509,\n\
\ \"acc_norm_stderr\": 0.032550547543808754,\n \"mc1\": 0.5324357405140759,\n\
\ \"mc1_stderr\": 0.017466632149577613,\n \"mc2\": 0.7031592585649867,\n\
\ \"mc2_stderr\": 0.014701036265284115\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6757679180887372,\n \"acc_stderr\": 0.013678810399518827,\n\
\ \"acc_norm\": 0.7064846416382252,\n \"acc_norm_stderr\": 0.01330725044494111\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6999601672973511,\n\
\ \"acc_stderr\": 0.004573383672159082,\n \"acc_norm\": 0.8787094204341764,\n\
\ \"acc_norm_stderr\": 0.0032579745937899368\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n\
\ \"acc_stderr\": 0.03533133389323657,\n \"acc_norm\": 0.6878612716763006,\n\
\ \"acc_norm_stderr\": 0.03533133389323657\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146267,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146267\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4417989417989418,\n \"acc_stderr\": 0.025576257061253833,\n \"\
acc_norm\": 0.4417989417989418,\n \"acc_norm_stderr\": 0.025576257061253833\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n\
\ \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.7903225806451613,\n\
\ \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.0315841532404771,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.0315841532404771\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.02805779167298902,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.02805779167298902\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033484,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033484\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971125,\n\
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971125\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \
\ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887034,\n\
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887034\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461766,\n \"\
acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461766\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250447,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250447\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457966,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313728,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313728\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709696,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709696\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286774,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286774\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8352490421455939,\n\
\ \"acc_stderr\": 0.013265346261323797,\n \"acc_norm\": 0.8352490421455939,\n\
\ \"acc_norm_stderr\": 0.013265346261323797\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044283,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044283\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4201117318435754,\n\
\ \"acc_stderr\": 0.016507671073256402,\n \"acc_norm\": 0.4201117318435754,\n\
\ \"acc_norm_stderr\": 0.016507671073256402\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.02389187954195961,\n\
\ \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.02389187954195961\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4771838331160365,\n\
\ \"acc_stderr\": 0.012756933382823696,\n \"acc_norm\": 0.4771838331160365,\n\
\ \"acc_norm_stderr\": 0.012756933382823696\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5324357405140759,\n\
\ \"mc1_stderr\": 0.017466632149577613,\n \"mc2\": 0.7031592585649867,\n\
\ \"mc2_stderr\": 0.014701036265284115\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8334648776637726,\n \"acc_stderr\": 0.0104707964967811\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6770280515542078,\n \
\ \"acc_stderr\": 0.012880360794851817\n }\n}\n```"
repo_url: https://huggingface.co/ChaoticNeutrals/Eris_PrimeV3-Vision-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|arc:challenge|25_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|gsm8k|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hellaswag|10_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T00-36-51.722400.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T00-36-51.722400.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- '**/details_harness|winogrande|5_2024-03-22T00-36-51.722400.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-22T00-36-51.722400.parquet'
- config_name: results
data_files:
- split: 2024_03_22T00_36_51.722400
path:
- results_2024-03-22T00-36-51.722400.parquet
- split: latest
path:
- results_2024-03-22T00-36-51.722400.parquet
---
# Dataset Card for Evaluation run of ChaoticNeutrals/Eris_PrimeV3-Vision-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ChaoticNeutrals/Eris_PrimeV3-Vision-7B](https://huggingface.co/ChaoticNeutrals/Eris_PrimeV3-Vision-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ChaoticNeutrals__Eris_PrimeV3-Vision-7B",
"harness_winogrande_5",
	split="latest")
```
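Each configuration exposes one split per evaluation run, named with the run's timestamp (for example `2024_03_22T00_36_51.722400`), plus a `latest` alias. As a minimal offline sketch of that naming convention, the newest timestamped split can be selected by parsing the names with `datetime` (the split list below is taken from this card; the format string is an assumption inferred from it):

```python
from datetime import datetime

# Split names in this dataset follow "%Y_%m_%dT%H_%M_%S.%f",
# plus a "latest" alias pointing at the most recent run.
splits = ["2024_03_22T00_36_51.722400", "latest"]

# Keep only the timestamped splits and pick the newest one.
timestamped = [s for s in splits if s != "latest"]
newest = max(timestamped, key=lambda s: datetime.strptime(s, "%Y_%m_%dT%H_%M_%S.%f"))
print(newest)  # → 2024_03_22T00_36_51.722400
```

With a single run, as here, the newest timestamped split and `latest` refer to the same data.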
## Latest results
These are the [latest results from run 2024-03-22T00:36:51.722400](https://huggingface.co/datasets/open-llm-leaderboard/details_ChaoticNeutrals__Eris_PrimeV3-Vision-7B/blob/main/results_2024-03-22T00-36-51.722400.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.657691343561988,
"acc_stderr": 0.031893777564086824,
"acc_norm": 0.6579346610974509,
"acc_norm_stderr": 0.032550547543808754,
"mc1": 0.5324357405140759,
"mc1_stderr": 0.017466632149577613,
"mc2": 0.7031592585649867,
"mc2_stderr": 0.014701036265284115
},
"harness|arc:challenge|25": {
"acc": 0.6757679180887372,
"acc_stderr": 0.013678810399518827,
"acc_norm": 0.7064846416382252,
"acc_norm_stderr": 0.01330725044494111
},
"harness|hellaswag|10": {
"acc": 0.6999601672973511,
"acc_stderr": 0.004573383672159082,
"acc_norm": 0.8787094204341764,
"acc_norm_stderr": 0.0032579745937899368
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.03533133389323657,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.03533133389323657
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4417989417989418,
"acc_stderr": 0.025576257061253833,
"acc_norm": 0.4417989417989418,
"acc_norm_stderr": 0.025576257061253833
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083522,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083522
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.0315841532404771,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.0315841532404771
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.02805779167298902,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.02805779167298902
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033484,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033484
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971125,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971125
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887034,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887034
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461766,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461766
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250447,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250447
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.02508596114457966,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.02508596114457966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313728,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313728
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709696,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709696
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286774,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286774
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8352490421455939,
"acc_stderr": 0.013265346261323797,
"acc_norm": 0.8352490421455939,
"acc_norm_stderr": 0.013265346261323797
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044283,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4201117318435754,
"acc_stderr": 0.016507671073256402,
"acc_norm": 0.4201117318435754,
"acc_norm_stderr": 0.016507671073256402
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.02389187954195961,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.02389187954195961
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4771838331160365,
"acc_stderr": 0.012756933382823696,
"acc_norm": 0.4771838331160365,
"acc_norm_stderr": 0.012756933382823696
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857833,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857833
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5324357405140759,
"mc1_stderr": 0.017466632149577613,
"mc2": 0.7031592585649867,
"mc2_stderr": 0.014701036265284115
},
"harness|winogrande|5": {
"acc": 0.8334648776637726,
"acc_stderr": 0.0104707964967811
},
"harness|gsm8k|5": {
"acc": 0.6770280515542078,
"acc_stderr": 0.012880360794851817
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
multimodalart/matryoshka-diffusion-models-paper-examples | ---
license: mit
---
# Matryoshka Diffusion Models - paper examples
This dataset contains the 1024x1024 images included in the [Matryoshka Diffusion Models](https://huggingface.co/papers/2310.15111) paper.
Arxiv: https://arxiv.org/abs/2310.15111
|
flaviolima/coringaaa | ---
license: openrail
---
|
chaoyi-wu/PMC-Inline | ---
license: apache-2.0
task_categories:
- text-generation
tags:
- biology
---
# PMC-Inline Dataset
- [PMC-Inline Dataset](#pmc-inline-dataset)
  - [Dataset Structure](#dataset-structure)
- [Sample](#sample)
This repository contains the text parts; the figure parts can be downloaded from https://pan.baidu.com/s/1Src_rhXsaOFp8zJ_3zMFsQ?pwd=p3ne.
## Dataset Structure
**PMC-Inline** (PMC papers with inline figures).
We collect the CC-licensed papers from PubMed Central and remove the bibliography, author info, tables, and image captions from the original paper XML files.
Based on the inline figure references, we link 11M images back into the paper contexts.
Each paper is organized as a PMCxxxxxxx.json file, where ```xxxxxxx``` refers to the paper's unique PMC id.
## Sample
A JSON file in the dataset is organized as below:
| info | {"article-type": "research-article", "pmid": "17925856", "pmc": "PMC1999654", "publisher-id": "07-PONE-RA-01026R1", "doi": "10.1371/journal.pone.0001008"} |
| ------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------- |
| text | \nPredicting Spatial Patterns of Plant Recruitment Using Animal-Displacement Kernels\nFor plants ... |
| img_ref | [{"id": "pone-0001008-g001", "start": 9177, "end": 9185}, {"id": "pone-0001008-g001", "start": 10715, "end": 10723}, ...] |
Explanation of each key:
- info: metadata about the paper, such as the article type, PMID, PMC id, and so on.
- text: a string which is the paper content.
- img_ref: a list recording which images are referenced and where in the original paper. For example, {"id": "pone-0001008-g001", "start": 9177, "end": 9185} denotes that figure pone-0001008-g001 is mentioned in the text string at indices 9177-9185.
You can get the image from our PMC figure parts; each figure is uniformly named ```PMCxxxxxxx_figid.jpg```, e.g. ```PMC1999654_pone-0001008-g001.jpg```.
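As a minimal sketch of how the `img_ref` offsets and the figure naming convention fit together (the record below is an illustrative, shortened stand-in for a real PMCxxxxxxx.json file, with made-up offsets; real files are loaded with `json.load`):

```python
import json  # a real file would be read with: record = json.load(open("PMC1999654.json"))

# Illustrative, shortened record mimicking a PMC-Inline JSON file.
record = {
    "info": {"pmc": "PMC1999654", "pmid": "17925856"},
    "text": "Predicting Spatial Patterns ... see Figure 1 for details.",
    "img_ref": [{"id": "pone-0001008-g001", "start": 36, "end": 44}],
}

for ref in record["img_ref"]:
    # start/end are character offsets into `text` where the figure is mentioned
    mention = record["text"][ref["start"]:ref["end"]]
    # figures in the figure release are named PMCxxxxxxx_figid.jpg
    fig_file = f"{record['info']['pmc']}_{ref['id']}.jpg"
    print(mention, "->", fig_file)  # prints: Figure 1 -> PMC1999654_pone-0001008-g001.jpg
```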
Note that our PMC figures were collected before PMC-Inline, and some papers were updated in the intervening time window, so some figures may be missing from our figure base. |
CreatorPhan/AnhWiki | ---
language:
- vi
---
Vietnamese Wikipedia dataset with 909258.
Includes 5110 records.
huggingartists/gunna | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/gunna"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 1.343267 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/18e3833ac527a4bf14ddf2acef834795.640x640x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/gunna">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Gunna</div>
<a href="https://genius.com/artists/gunna">
<div style="text-align: center; font-size: 14px;">@gunna</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/gunna).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/gunna")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|567| -| -|
'Train' can easily be divided into 'train', 'validation' & 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/gunna")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
train, validation, test = np.split(datasets['train']['text'], [int(len(datasets['train']['text'])*train_percentage), int(len(datasets['train']['text'])*(train_percentage + validation_percentage))])
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
    author = {Aleksey Korshuk},
    year = {2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
ciempiess/librivox_spanish | ---
license: cc-by-sa-4.0
---
|
GalacticV/Jasmine | ---
license: openrail
---
|
open-llm-leaderboard/details_xdatasi__antares-7b-slovenian | ---
pretty_name: Evaluation run of xdatasi/antares-7b-slovenian
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [xdatasi/antares-7b-slovenian](https://huggingface.co/xdatasi/antares-7b-slovenian)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xdatasi__antares-7b-slovenian\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-21T11:59:13.743883](https://huggingface.co/datasets/open-llm-leaderboard/details_xdatasi__antares-7b-slovenian/blob/main/results_2024-03-21T11-59-13.743883.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23196194129343728,\n\
\ \"acc_stderr\": 0.029934654752561563,\n \"acc_norm\": 0.2314240573187148,\n\
\ \"acc_norm_stderr\": 0.03071122006512167,\n \"mc1\": 1.0,\n \
\ \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n\
\ },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22696245733788395,\n\
\ \"acc_stderr\": 0.012240491536132861,\n \"acc_norm\": 0.22696245733788395,\n\
\ \"acc_norm_stderr\": 0.012240491536132861\n },\n \"harness|hellaswag|10\"\
: {\n \"acc\": 0.2504481179047998,\n \"acc_stderr\": 0.004323856300539177,\n\
\ \"acc_norm\": 0.2504481179047998,\n \"acc_norm_stderr\": 0.004323856300539177\n\
\ },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\"\
: {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n\
\ \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n\
\ },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n\
\ \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n\
\ \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n\
\ \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n\
\ \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\"\
: {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n\
\ \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n\
\ },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\":\
\ 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n\
\ \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n\
\ \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n\
\ \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\
\ 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"\
acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n\
\ \"mc2_stderr\": NaN\n },\n \"harness|winogrande|5\": {\n \"\
acc\": 0.4956590370955012,\n \"acc_stderr\": 0.014051956064076911\n },\n\
\ \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n\
\ }\n}\n```"
repo_url: https://huggingface.co/xdatasi/antares-7b-slovenian
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|arc:challenge|25_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|gsm8k|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hellaswag|10_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T11-59-13.743883.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T11-59-13.743883.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- '**/details_harness|winogrande|5_2024-03-21T11-59-13.743883.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-21T11-59-13.743883.parquet'
- config_name: results
data_files:
- split: 2024_03_21T11_59_13.743883
path:
- results_2024-03-21T11-59-13.743883.parquet
- split: latest
path:
- results_2024-03-21T11-59-13.743883.parquet
---
# Dataset Card for Evaluation run of xdatasi/antares-7b-slovenian
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [xdatasi/antares-7b-slovenian](https://huggingface.co/xdatasi/antares-7b-slovenian) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xdatasi__antares-7b-slovenian",
"harness_winogrande_5",
split="train")
```
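Because the per-run split names embed a zero-padded timestamp, the newest run can also be picked by sorting the names as plain strings — a minimal sketch, with hypothetical split names in the format shown above:

```python
# Hypothetical split names in the timestamped format used by this dataset;
# "latest" is an alias maintained by the harness, not a timestamp itself.
splits = ["2024_03_20T09_00_00.000000", "2024_03_21T11_59_13.743883", "latest"]

# Zero-padded timestamps sort chronologically as plain strings,
# so the lexicographic maximum is the most recent run.
timestamped = [s for s in splits if s != "latest"]
newest = max(timestamped)
print(newest)  # 2024_03_21T11_59_13.743883
```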
## Latest results
These are the [latest results from run 2024-03-21T11:59:13.743883](https://huggingface.co/datasets/open-llm-leaderboard/details_xdatasi__antares-7b-slovenian/blob/main/results_2024-03-21T11-59-13.743883.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in its own configuration, with a "latest" split for each eval):
```python
{
"all": {
"acc": 0.23196194129343728,
"acc_stderr": 0.029934654752561563,
"acc_norm": 0.2314240573187148,
"acc_norm_stderr": 0.03071122006512167,
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.22696245733788395,
"acc_stderr": 0.012240491536132861,
"acc_norm": 0.22696245733788395,
"acc_norm_stderr": 0.012240491536132861
},
"harness|hellaswag|10": {
"acc": 0.2504481179047998,
"acc_stderr": 0.004323856300539177,
"acc_norm": 0.2504481179047998,
"acc_norm_stderr": 0.004323856300539177
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|winogrande|5": {
"acc": 0.4956590370955012,
"acc_stderr": 0.014051956064076911
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
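The "all" block is an aggregate over the per-task entries. A small sketch of recomputing such a mean from a results dictionary, using two of the task entries above as sample data (the real aggregate averages over every task, so this is illustrative only):

```python
# Sample per-task accuracies copied from the JSON above, truncated to two tasks.
results = {
    "harness|arc:challenge|25": {"acc": 0.22696245733788395},
    "harness|hellaswag|10": {"acc": 0.2504481179047998},
}

# Mean accuracy across the sampled tasks.
accs = [task["acc"] for task in results.values()]
mean_acc = sum(accs) / len(accs)
print(round(mean_acc, 4))  # 0.2387
```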
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
datahrvoje/twitter_dataset_1713012600 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 24305
num_examples: 57
download_size: 13527
dataset_size: 24305
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
irds/trec-spanish | ---
pretty_name: '`trec-spanish`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `trec-spanish`
The `trec-spanish` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/trec-spanish#trec-spanish).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=120,605
This dataset is used by: [`trec-spanish_trec3`](https://huggingface.co/datasets/irds/trec-spanish_trec3), [`trec-spanish_trec4`](https://huggingface.co/datasets/irds/trec-spanish_trec4)
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/trec-spanish', 'docs')
for record in docs:
record # {'doc_id': ..., 'text': ..., 'marked_up_doc': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@misc{Rogers2000Spanish,
title={TREC Spanish LDC2000T51},
author={Rogers, Willie},
year={2000},
url={https://catalog.ldc.upenn.edu/LDC2000T51},
publisher={Linguistic Data Consortium}
}
```
|
michaelmallari/airbnb-usa-mt-bozeman | ---
license: mit
---
|
heliosprime/twitter_dataset_1713054449 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 11478
num_examples: 26
download_size: 8699
dataset_size: 11478
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713054449"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
marcosd59/mapa_curricular | ---
language:
- es
--- |
open-llm-leaderboard/details_cognitivecomputations__laserxtral | ---
pretty_name: Evaluation run of cognitivecomputations/laserxtral
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cognitivecomputations/laserxtral](https://huggingface.co/cognitivecomputations/laserxtral)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cognitivecomputations__laserxtral\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-15T20:05:55.052145](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__laserxtral/blob/main/results_2024-01-15T20-05-55.052145.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6512747160527517,\n\
\ \"acc_stderr\": 0.03203788592010298,\n \"acc_norm\": 0.6512676159278845,\n\
\ \"acc_norm_stderr\": 0.03269445186595507,\n \"mc1\": 0.4773561811505508,\n\
\ \"mc1_stderr\": 0.01748554225848965,\n \"mc2\": 0.638014374039814,\n\
\ \"mc2_stderr\": 0.01550457171837664\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6697952218430034,\n \"acc_stderr\": 0.013743085603760424,\n\
\ \"acc_norm\": 0.6902730375426621,\n \"acc_norm_stderr\": 0.013512058415238363\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6931886078470424,\n\
\ \"acc_stderr\": 0.004602279238122065,\n \"acc_norm\": 0.8675562636924915,\n\
\ \"acc_norm_stderr\": 0.0033827979075230284\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700914,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700914\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782648,\n \"\
acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782648\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033477,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033477\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.034076320938540516,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.034076320938540516\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233494,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233494\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n\
\ \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8378033205619413,\n\
\ \"acc_stderr\": 0.01318222261672088,\n \"acc_norm\": 0.8378033205619413,\n\
\ \"acc_norm_stderr\": 0.01318222261672088\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3877094972067039,\n\
\ \"acc_stderr\": 0.016295332328155807,\n \"acc_norm\": 0.3877094972067039,\n\
\ \"acc_norm_stderr\": 0.016295332328155807\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.023788583551658533,\n\
\ \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.023788583551658533\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4556714471968709,\n\
\ \"acc_stderr\": 0.012719949543032205,\n \"acc_norm\": 0.4556714471968709,\n\
\ \"acc_norm_stderr\": 0.012719949543032205\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.028814722422254184,\n\
\ \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.028814722422254184\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \
\ \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712844,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712844\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.02619392354445412,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.02619392354445412\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4773561811505508,\n\
\ \"mc1_stderr\": 0.01748554225848965,\n \"mc2\": 0.638014374039814,\n\
\ \"mc2_stderr\": 0.01550457171837664\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8003157063930545,\n \"acc_stderr\": 0.011235328382625842\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6974981046247157,\n \
\ \"acc_stderr\": 0.012652544133186141\n }\n}\n```"
repo_url: https://huggingface.co/cognitivecomputations/laserxtral
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|arc:challenge|25_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|gsm8k|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hellaswag|10_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T20-05-55.052145.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-15T20-05-55.052145.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- '**/details_harness|winogrande|5_2024-01-15T20-05-55.052145.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-15T20-05-55.052145.parquet'
- config_name: results
data_files:
- split: 2024_01_15T20_05_55.052145
path:
- results_2024-01-15T20-05-55.052145.parquet
- split: latest
path:
- results_2024-01-15T20-05-55.052145.parquet
---
# Dataset Card for Evaluation run of cognitivecomputations/laserxtral
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cognitivecomputations/laserxtral](https://huggingface.co/cognitivecomputations/laserxtral) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cognitivecomputations__laserxtral",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-15T20:05:55.052145](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__laserxtral/blob/main/results_2024-01-15T20-05-55.052145.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6512747160527517,
"acc_stderr": 0.03203788592010298,
"acc_norm": 0.6512676159278845,
"acc_norm_stderr": 0.03269445186595507,
"mc1": 0.4773561811505508,
"mc1_stderr": 0.01748554225848965,
"mc2": 0.638014374039814,
"mc2_stderr": 0.01550457171837664
},
"harness|arc:challenge|25": {
"acc": 0.6697952218430034,
"acc_stderr": 0.013743085603760424,
"acc_norm": 0.6902730375426621,
"acc_norm_stderr": 0.013512058415238363
},
"harness|hellaswag|10": {
"acc": 0.6931886078470424,
"acc_stderr": 0.004602279238122065,
"acc_norm": 0.8675562636924915,
"acc_norm_stderr": 0.0033827979075230284
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700914,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700914
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782648,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782648
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033477,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033477
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233494,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8378033205619413,
"acc_stderr": 0.01318222261672088,
"acc_norm": 0.8378033205619413,
"acc_norm_stderr": 0.01318222261672088
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069356,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069356
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3877094972067039,
"acc_stderr": 0.016295332328155807,
"acc_norm": 0.3877094972067039,
"acc_norm_stderr": 0.016295332328155807
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.023788583551658533,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.023788583551658533
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4556714471968709,
"acc_stderr": 0.012719949543032205,
"acc_norm": 0.4556714471968709,
"acc_norm_stderr": 0.012719949543032205
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.028814722422254184,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.028814722422254184
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.02853556033712844,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.02853556033712844
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.02619392354445412,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.02619392354445412
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4773561811505508,
"mc1_stderr": 0.01748554225848965,
"mc2": 0.638014374039814,
"mc2_stderr": 0.01550457171837664
},
"harness|winogrande|5": {
"acc": 0.8003157063930545,
"acc_stderr": 0.011235328382625842
},
"harness|gsm8k|5": {
"acc": 0.6974981046247157,
"acc_stderr": 0.012652544133186141
}
}
```
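The per-task scores above all live under keys of the form `harness|<task>|<n_shots>`. A minimal sketch of pulling out each task's accuracy from such a dict (the key format is assumed from the JSON shown above; the helper name is illustrative, not part of the leaderboard tooling):

```python
# Extract per-task accuracy from a results dict shaped like the JSON above.
# Keys follow "harness|<task>|<n_shots>"; the "all" entry has no "|" and is skipped.
results = {
    "all": {"acc": 0.6512747160527517},
    "harness|arc:challenge|25": {"acc": 0.6697952218430034, "acc_norm": 0.6902730375426621},
    "harness|hellaswag|10": {"acc": 0.6931886078470424, "acc_norm": 0.8675562636924915},
    "harness|winogrande|5": {"acc": 0.8003157063930545},
}

def task_accuracies(results: dict) -> dict:
    """Map task name -> acc, parsed from 'harness|<task>|<n_shots>' keys."""
    out = {}
    for key, metrics in results.items():
        if key.count("|") != 2 or "acc" not in metrics:
            continue  # skip aggregates like "all" and acc-less tasks (e.g. truthfulqa mc)
        _, task, _ = key.split("|")
        out[task] = metrics["acc"]
    return out

print(task_accuracies(results)["winogrande"])  # 0.8003157063930545
```

In the real card you would first load the `results` config with `load_dataset` as shown earlier, rather than hard-coding the dict.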
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
MohdShawezKhan/iits-english-hindi | ---
license: mit
dataset_info:
features:
- name: translation
struct:
- name: en
dtype: string
- name: hi
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 390618
num_examples: 1000
download_size: 209961
dataset_size: 390618
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tyzhu/find_second_sent_train_50_eval_10_sentbefore | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 220505
num_examples: 170
- name: validation
num_bytes: 9071
num_examples: 10
download_size: 92636
dataset_size: 229576
---
# Dataset Card for "find_second_sent_train_50_eval_10_sentbefore"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ibranze/araproje_mmlu_en_conf_llama_nearestscore_true_x | ---
dataset_info:
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: validation
num_bytes: 130579.0
num_examples: 250
download_size: 0
dataset_size: 130579.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_mmlu_en_conf_llama_nearestscore_true_x"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Enno-Ai__vigogne2-enno-13b-sft-lora-4bit | ---
pretty_name: Evaluation run of Enno-Ai/vigogne2-enno-13b-sft-lora-4bit
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Enno-Ai/vigogne2-enno-13b-sft-lora-4bit](https://huggingface.co/Enno-Ai/vigogne2-enno-13b-sft-lora-4bit)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Enno-Ai__vigogne2-enno-13b-sft-lora-4bit\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-23T10:29:28.223248](https://huggingface.co/datasets/open-llm-leaderboard/details_Enno-Ai__vigogne2-enno-13b-sft-lora-4bit/blob/main/results_2023-10-23T10-29-28.223248.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.38370385906040266,\n\
\ \"em_stderr\": 0.00498003573381493,\n \"f1\": 0.4364649748322163,\n\
\ \"f1_stderr\": 0.004838389403253292,\n \"acc\": 0.3855253166488449,\n\
\ \"acc_stderr\": 0.006453825756692964\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.38370385906040266,\n \"em_stderr\": 0.00498003573381493,\n\
\ \"f1\": 0.4364649748322163,\n \"f1_stderr\": 0.004838389403253292\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.001516300227445034,\n \
\ \"acc_stderr\": 0.0010717793485492625\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.011835872164836666\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Enno-Ai/vigogne2-enno-13b-sft-lora-4bit
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|arc:challenge|25_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_23T10_29_28.223248
path:
- '**/details_harness|drop|3_2023-10-23T10-29-28.223248.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-23T10-29-28.223248.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_23T10_29_28.223248
path:
- '**/details_harness|gsm8k|5_2023-10-23T10-29-28.223248.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-23T10-29-28.223248.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hellaswag|10_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T14-53-48.356901.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T14-53-48.356901.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T14-53-48.356901.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_23T10_29_28.223248
path:
- '**/details_harness|winogrande|5_2023-10-23T10-29-28.223248.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-23T10-29-28.223248.parquet'
- config_name: results
data_files:
- split: 2023_09_12T14_53_48.356901
path:
- results_2023-09-12T14-53-48.356901.parquet
- split: 2023_10_23T10_29_28.223248
path:
- results_2023-10-23T10-29-28.223248.parquet
- split: latest
path:
- results_2023-10-23T10-29-28.223248.parquet
---
# Dataset Card for Evaluation run of Enno-Ai/vigogne2-enno-13b-sft-lora-4bit
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Enno-Ai/vigogne2-enno-13b-sft-lora-4bit
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Enno-Ai/vigogne2-enno-13b-sft-lora-4bit](https://huggingface.co/Enno-Ai/vigogne2-enno-13b-sft-lora-4bit) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Enno-Ai__vigogne2-enno-13b-sft-lora-4bit",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-23T10:29:28.223248](https://huggingface.co/datasets/open-llm-leaderboard/details_Enno-Ai__vigogne2-enno-13b-sft-lora-4bit/blob/main/results_2023-10-23T10-29-28.223248.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the "results" and "latest" splits of each eval):
```python
{
"all": {
"em": 0.38370385906040266,
"em_stderr": 0.00498003573381493,
"f1": 0.4364649748322163,
"f1_stderr": 0.004838389403253292,
"acc": 0.3855253166488449,
"acc_stderr": 0.006453825756692964
},
"harness|drop|3": {
"em": 0.38370385906040266,
"em_stderr": 0.00498003573381493,
"f1": 0.4364649748322163,
"f1_stderr": 0.004838389403253292
},
"harness|gsm8k|5": {
"acc": 0.001516300227445034,
"acc_stderr": 0.0010717793485492625
},
"harness|winogrande|5": {
"acc": 0.7695343330702447,
"acc_stderr": 0.011835872164836666
}
}
```
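Once downloaded, the results JSON above can be post-processed locally without the `datasets` library. The snippet below is a minimal sketch: the `results` dict reproduces a subset of the structure shown above (the task keys and values are copied from it, not fetched), and it flattens the per-task scores into `(task, metric, value)` rows, skipping the `"all"` aggregates and the stderr entries.

```python
# Minimal sketch: flatten the nested results dict shown above into
# (task, metric, value) rows. The `results` literal mirrors the JSON
# structure of the latest run; only a subset of tasks is reproduced here.
results = {
    "all": {
        "em": 0.38370385906040266,
        "f1": 0.4364649748322163,
        "acc": 0.3855253166488449,
    },
    "harness|gsm8k|5": {
        "acc": 0.001516300227445034,
        "acc_stderr": 0.0010717793485492625,
    },
    "harness|winogrande|5": {
        "acc": 0.7695343330702447,
        "acc_stderr": 0.011835872164836666,
    },
}

rows = [
    (task, metric, value)
    for task, metrics in results.items()
    if task != "all"                      # "all" holds the aggregates
    for metric, value in metrics.items()
    if not metric.endswith("_stderr")     # keep point estimates only
]

for task, metric, value in sorted(rows):
    print(f"{task:<25} {metric:<6} {value:.4f}")
```

The same loop works on the full JSON file after `json.load`, since every per-task entry follows the `metric` / `metric_stderr` naming convention.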
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
viber1/lawlingo-data-set | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 495818
num_examples: 933
download_size: 210574
dataset_size: 495818
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
NathanDrake/Education | ---
license: afl-3.0
---
|
msong/github-issues | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: labels
list:
- name: id
dtype: int64
- name: node_id
dtype: string
- name: url
dtype: string
- name: name
dtype: string
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: assignees
list:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: milestone
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: labels_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: description
dtype: string
- name: creator
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: open_issues
dtype: int64
- name: closed_issues
dtype: int64
- name: state
dtype: string
- name: created_at
dtype: int64
- name: updated_at
dtype: int64
- name: due_on
dtype: int64
- name: closed_at
dtype: int64
- name: comments
sequence: string
- name: created_at
dtype: int64
- name: updated_at
dtype: int64
- name: closed_at
dtype: int64
- name: author_association
dtype: string
- name: active_lock_reason
dtype: 'null'
- name: pull_request
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: diff_url
dtype: string
- name: patch_url
dtype: string
- name: body
dtype: string
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: 'null'
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 10233851
num_examples: 3019
download_size: 0
dataset_size: 10233851
---
# Dataset Card for "github-issues"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ch08931/GabyFran | ---
license: openrail
---
|
gabrielaltay/hacdc-wikipedia | ---
license: cc-by-sa-3.0
---
|
Anderson1992/stan | ---
license: openrail
---
|
bh8648/split_dataset_15 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: page_num
dtype: int64
splits:
- name: train
num_bytes: 903499
num_examples: 212
download_size: 462496
dataset_size: 903499
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "split_dataset_15"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ConvexAI__Luminex-72B-v0.1 | ---
pretty_name: Evaluation run of ConvexAI/Luminex-72B-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ConvexAI/Luminex-72B-v0.1](https://huggingface.co/ConvexAI/Luminex-72B-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ConvexAI__Luminex-72B-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-18T05:01:06.344002](https://huggingface.co/datasets/open-llm-leaderboard/details_ConvexAI__Luminex-72B-v0.1/blob/main/results_2024-02-18T05-01-06.344002.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7281401822398715,\n\
\ \"acc_stderr\": 0.029809060666632293,\n \"acc_norm\": 0.7308207069290532,\n\
\ \"acc_norm_stderr\": 0.030397530796361143,\n \"mc1\": 0.27539779681762544,\n\
\ \"mc1_stderr\": 0.01563813566777552,\n \"mc2\": 0.4184792162618759,\n\
\ \"mc2_stderr\": 0.015267964699867515\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3984641638225256,\n \"acc_stderr\": 0.014306946052735563,\n\
\ \"acc_norm\": 0.43430034129692835,\n \"acc_norm_stderr\": 0.01448470304885736\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6845249950209121,\n\
\ \"acc_stderr\": 0.004637550478007368,\n \"acc_norm\": 0.8665604461262697,\n\
\ \"acc_norm_stderr\": 0.0033935420742276542\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.04094376269996793,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.04094376269996793\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8618421052631579,\n \"acc_stderr\": 0.028081042939576552,\n\
\ \"acc_norm\": 0.8618421052631579,\n \"acc_norm_stderr\": 0.028081042939576552\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.78,\n\
\ \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n \
\ \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7886792452830189,\n \"acc_stderr\": 0.025125766484827845,\n\
\ \"acc_norm\": 0.7886792452830189,\n \"acc_norm_stderr\": 0.025125766484827845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.031164899666948617,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.031164899666948617\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7225433526011561,\n\
\ \"acc_stderr\": 0.03414014007044037,\n \"acc_norm\": 0.7225433526011561,\n\
\ \"acc_norm_stderr\": 0.03414014007044037\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.49019607843137253,\n \"acc_stderr\": 0.04974229460422817,\n\
\ \"acc_norm\": 0.49019607843137253,\n \"acc_norm_stderr\": 0.04974229460422817\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7574468085106383,\n \"acc_stderr\": 0.02802022627120022,\n\
\ \"acc_norm\": 0.7574468085106383,\n \"acc_norm_stderr\": 0.02802022627120022\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5701754385964912,\n\
\ \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.5701754385964912,\n\
\ \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7517241379310344,\n \"acc_stderr\": 0.03600105692727772,\n\
\ \"acc_norm\": 0.7517241379310344,\n \"acc_norm_stderr\": 0.03600105692727772\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.6216931216931217,\n \"acc_stderr\": 0.024976954053155236,\n \"\
acc_norm\": 0.6216931216931217,\n \"acc_norm_stderr\": 0.024976954053155236\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5238095238095238,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.5238095238095238,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8709677419354839,\n\
\ \"acc_stderr\": 0.019070889254792753,\n \"acc_norm\": 0.8709677419354839,\n\
\ \"acc_norm_stderr\": 0.019070889254792753\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6009852216748769,\n \"acc_stderr\": 0.034454876862647164,\n\
\ \"acc_norm\": 0.6009852216748769,\n \"acc_norm_stderr\": 0.034454876862647164\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\"\
: 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721175,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721175\n \
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.898989898989899,\n \"acc_stderr\": 0.02146973557605533,\n \"acc_norm\"\
: 0.898989898989899,\n \"acc_norm_stderr\": 0.02146973557605533\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527026,\n\
\ \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527026\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7461538461538462,\n \"acc_stderr\": 0.022066054378726257,\n\
\ \"acc_norm\": 0.7461538461538462,\n \"acc_norm_stderr\": 0.022066054378726257\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.4703703703703704,\n \"acc_stderr\": 0.030431963547936577,\n \
\ \"acc_norm\": 0.4703703703703704,\n \"acc_norm_stderr\": 0.030431963547936577\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7899159663865546,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.7899159663865546,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5231788079470199,\n \"acc_stderr\": 0.04078093859163085,\n \"\
acc_norm\": 0.5231788079470199,\n \"acc_norm_stderr\": 0.04078093859163085\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9119266055045872,\n \"acc_stderr\": 0.012150743719481669,\n \"\
acc_norm\": 0.9119266055045872,\n \"acc_norm_stderr\": 0.012150743719481669\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.625,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.625,\n\
\ \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8970588235294118,\n \"acc_stderr\": 0.02132833757080437,\n\
\ \"acc_norm\": 0.8970588235294118,\n \"acc_norm_stderr\": 0.02132833757080437\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944842,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944842\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7757847533632287,\n\
\ \"acc_stderr\": 0.027991534258519517,\n \"acc_norm\": 0.7757847533632287,\n\
\ \"acc_norm_stderr\": 0.027991534258519517\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.030194823996804475,\n\
\ \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.030194823996804475\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8842975206611571,\n \"acc_stderr\": 0.029199802455622804,\n \"\
acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.029199802455622804\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8098159509202454,\n \"acc_stderr\": 0.03083349114628125,\n\
\ \"acc_norm\": 0.8098159509202454,\n \"acc_norm_stderr\": 0.03083349114628125\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5714285714285714,\n\
\ \"acc_stderr\": 0.04697113923010213,\n \"acc_norm\": 0.5714285714285714,\n\
\ \"acc_norm_stderr\": 0.04697113923010213\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n\
\ \"acc_stderr\": 0.015537514263253876,\n \"acc_norm\": 0.9401709401709402,\n\
\ \"acc_norm_stderr\": 0.015537514263253876\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9169859514687101,\n\
\ \"acc_stderr\": 0.009866287394639541,\n \"acc_norm\": 0.9169859514687101,\n\
\ \"acc_norm_stderr\": 0.009866287394639541\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7716763005780347,\n \"acc_stderr\": 0.022598703804321628,\n\
\ \"acc_norm\": 0.7716763005780347,\n \"acc_norm_stderr\": 0.022598703804321628\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5318435754189944,\n\
\ \"acc_stderr\": 0.016688553415612213,\n \"acc_norm\": 0.5318435754189944,\n\
\ \"acc_norm_stderr\": 0.016688553415612213\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7908496732026143,\n \"acc_stderr\": 0.023287685312334806,\n\
\ \"acc_norm\": 0.7908496732026143,\n \"acc_norm_stderr\": 0.023287685312334806\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7877813504823151,\n\
\ \"acc_stderr\": 0.023222756797435098,\n \"acc_norm\": 0.7877813504823151,\n\
\ \"acc_norm_stderr\": 0.023222756797435098\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8271604938271605,\n \"acc_stderr\": 0.021038517770157365,\n\
\ \"acc_norm\": 0.8271604938271605,\n \"acc_norm_stderr\": 0.021038517770157365\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6099290780141844,\n \"acc_stderr\": 0.02909767559946393,\n \
\ \"acc_norm\": 0.6099290780141844,\n \"acc_norm_stderr\": 0.02909767559946393\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5723598435462842,\n\
\ \"acc_stderr\": 0.01263579992276585,\n \"acc_norm\": 0.5723598435462842,\n\
\ \"acc_norm_stderr\": 0.01263579992276585\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8125,\n \"acc_stderr\": 0.023709788253811766,\n \
\ \"acc_norm\": 0.8125,\n \"acc_norm_stderr\": 0.023709788253811766\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7647058823529411,\n \"acc_stderr\": 0.017160587235046345,\n \
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.017160587235046345\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784613,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784613\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n\
\ \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n\
\ \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.0261682213446623,\n\
\ \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.0261682213446623\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27539779681762544,\n\
\ \"mc1_stderr\": 0.01563813566777552,\n \"mc2\": 0.4184792162618759,\n\
\ \"mc2_stderr\": 0.015267964699867515\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7616416732438832,\n \"acc_stderr\": 0.01197494866770231\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7543593631539045,\n \
\ \"acc_stderr\": 0.011857183603902225\n }\n}\n```"
repo_url: https://huggingface.co/ConvexAI/Luminex-72B-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|arc:challenge|25_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|arc:challenge|25_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|gsm8k|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|gsm8k|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hellaswag|10_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hellaswag|10_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T04-48-02.244720.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T05-01-06.344002.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T05-01-06.344002.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- '**/details_harness|winogrande|5_2024-02-18T04-48-02.244720.parquet'
- split: 2024_02_18T05_01_06.344002
path:
- '**/details_harness|winogrande|5_2024-02-18T05-01-06.344002.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-18T05-01-06.344002.parquet'
- config_name: results
data_files:
- split: 2024_02_18T04_48_02.244720
path:
- results_2024-02-18T04-48-02.244720.parquet
- split: 2024_02_18T05_01_06.344002
path:
- results_2024-02-18T05-01-06.344002.parquet
- split: latest
path:
- results_2024-02-18T05-01-06.344002.parquet
---
# Dataset Card for Evaluation run of ConvexAI/Luminex-72B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ConvexAI/Luminex-72B-v0.1](https://huggingface.co/ConvexAI/Luminex-72B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ConvexAI__Luminex-72B-v0.1",
"harness_winogrande_5",
	split="latest")
```
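Note that the timestamped split names (e.g. `2024_02_18T05_01_06.344002`) use a zero-padded `YYYY_MM_DDTHH_MM_SS.ffffff` pattern, so they sort lexicographically in chronological order. As a small illustrative sketch (not part of the generated card), the newest run can therefore be picked without relying on the "latest" alias:

```python
# Timestamped split names are zero-padded, so lexicographic order equals
# chronological order and max() returns the most recent run.
splits = ["2024_02_18T04_48_02.244720", "2024_02_18T05_01_06.344002"]
newest = max(splits)
print(newest)  # 2024_02_18T05_01_06.344002
```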
## Latest results
These are the [latest results from run 2024-02-18T05:01:06.344002](https://huggingface.co/datasets/open-llm-leaderboard/details_ConvexAI__Luminex-72B-v0.1/blob/main/results_2024-02-18T05-01-06.344002.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.7281401822398715,
"acc_stderr": 0.029809060666632293,
"acc_norm": 0.7308207069290532,
"acc_norm_stderr": 0.030397530796361143,
"mc1": 0.27539779681762544,
"mc1_stderr": 0.01563813566777552,
"mc2": 0.4184792162618759,
"mc2_stderr": 0.015267964699867515
},
"harness|arc:challenge|25": {
"acc": 0.3984641638225256,
"acc_stderr": 0.014306946052735563,
"acc_norm": 0.43430034129692835,
"acc_norm_stderr": 0.01448470304885736
},
"harness|hellaswag|10": {
"acc": 0.6845249950209121,
"acc_stderr": 0.004637550478007368,
"acc_norm": 0.8665604461262697,
"acc_norm_stderr": 0.0033935420742276542
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996793,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996793
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8618421052631579,
"acc_stderr": 0.028081042939576552,
"acc_norm": 0.8618421052631579,
"acc_norm_stderr": 0.028081042939576552
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7886792452830189,
"acc_stderr": 0.025125766484827845,
"acc_norm": 0.7886792452830189,
"acc_norm_stderr": 0.025125766484827845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.031164899666948617,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.031164899666948617
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.03414014007044037,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.03414014007044037
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.49019607843137253,
"acc_stderr": 0.04974229460422817,
"acc_norm": 0.49019607843137253,
"acc_norm_stderr": 0.04974229460422817
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7574468085106383,
"acc_stderr": 0.02802022627120022,
"acc_norm": 0.7574468085106383,
"acc_norm_stderr": 0.02802022627120022
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5701754385964912,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.5701754385964912,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7517241379310344,
"acc_stderr": 0.03600105692727772,
"acc_norm": 0.7517241379310344,
"acc_norm_stderr": 0.03600105692727772
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6216931216931217,
"acc_stderr": 0.024976954053155236,
"acc_norm": 0.6216931216931217,
"acc_norm_stderr": 0.024976954053155236
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5238095238095238,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.5238095238095238,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8709677419354839,
"acc_stderr": 0.019070889254792753,
"acc_norm": 0.8709677419354839,
"acc_norm_stderr": 0.019070889254792753
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6009852216748769,
"acc_stderr": 0.034454876862647164,
"acc_norm": 0.6009852216748769,
"acc_norm_stderr": 0.034454876862647164
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.031234752377721175,
"acc_norm": 0.8,
"acc_norm_stderr": 0.031234752377721175
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.898989898989899,
"acc_stderr": 0.02146973557605533,
"acc_norm": 0.898989898989899,
"acc_norm_stderr": 0.02146973557605533
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9689119170984456,
"acc_stderr": 0.012525310625527026,
"acc_norm": 0.9689119170984456,
"acc_norm_stderr": 0.012525310625527026
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7461538461538462,
"acc_stderr": 0.022066054378726257,
"acc_norm": 0.7461538461538462,
"acc_norm_stderr": 0.022066054378726257
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4703703703703704,
"acc_stderr": 0.030431963547936577,
"acc_norm": 0.4703703703703704,
"acc_norm_stderr": 0.030431963547936577
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7899159663865546,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.7899159663865546,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5231788079470199,
"acc_stderr": 0.04078093859163085,
"acc_norm": 0.5231788079470199,
"acc_norm_stderr": 0.04078093859163085
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9119266055045872,
"acc_stderr": 0.012150743719481669,
"acc_norm": 0.9119266055045872,
"acc_norm_stderr": 0.012150743719481669
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.625,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.625,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8970588235294118,
"acc_stderr": 0.02132833757080437,
"acc_norm": 0.8970588235294118,
"acc_norm_stderr": 0.02132833757080437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944842,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944842
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7757847533632287,
"acc_stderr": 0.027991534258519517,
"acc_norm": 0.7757847533632287,
"acc_norm_stderr": 0.027991534258519517
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8625954198473282,
"acc_stderr": 0.030194823996804475,
"acc_norm": 0.8625954198473282,
"acc_norm_stderr": 0.030194823996804475
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.029199802455622804,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.029199802455622804
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8098159509202454,
"acc_stderr": 0.03083349114628125,
"acc_norm": 0.8098159509202454,
"acc_norm_stderr": 0.03083349114628125
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04697113923010213,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04697113923010213
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9401709401709402,
"acc_stderr": 0.015537514263253876,
"acc_norm": 0.9401709401709402,
"acc_norm_stderr": 0.015537514263253876
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9169859514687101,
"acc_stderr": 0.009866287394639541,
"acc_norm": 0.9169859514687101,
"acc_norm_stderr": 0.009866287394639541
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7716763005780347,
"acc_stderr": 0.022598703804321628,
"acc_norm": 0.7716763005780347,
"acc_norm_stderr": 0.022598703804321628
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5318435754189944,
"acc_stderr": 0.016688553415612213,
"acc_norm": 0.5318435754189944,
"acc_norm_stderr": 0.016688553415612213
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7908496732026143,
"acc_stderr": 0.023287685312334806,
"acc_norm": 0.7908496732026143,
"acc_norm_stderr": 0.023287685312334806
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7877813504823151,
"acc_stderr": 0.023222756797435098,
"acc_norm": 0.7877813504823151,
"acc_norm_stderr": 0.023222756797435098
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8271604938271605,
"acc_stderr": 0.021038517770157365,
"acc_norm": 0.8271604938271605,
"acc_norm_stderr": 0.021038517770157365
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6099290780141844,
"acc_stderr": 0.02909767559946393,
"acc_norm": 0.6099290780141844,
"acc_norm_stderr": 0.02909767559946393
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5723598435462842,
"acc_stderr": 0.01263579992276585,
"acc_norm": 0.5723598435462842,
"acc_norm_stderr": 0.01263579992276585
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8125,
"acc_stderr": 0.023709788253811766,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.023709788253811766
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.017160587235046345,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.017160587235046345
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784613,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784613
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166327,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.038695433234721015,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.038695433234721015
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.0261682213446623,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.0261682213446623
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27539779681762544,
"mc1_stderr": 0.01563813566777552,
"mc2": 0.4184792162618759,
"mc2_stderr": 0.015267964699867515
},
"harness|winogrande|5": {
"acc": 0.7616416732438832,
"acc_stderr": 0.01197494866770231
},
"harness|gsm8k|5": {
"acc": 0.7543593631539045,
"acc_stderr": 0.011857183603902225
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
joey234/mmlu-machine_learning-verbal-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 51682
num_examples: 112
download_size: 29605
dataset_size: 51682
---
# Dataset Card for "mmlu-machine_learning-verbal-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
davanstrien/dataset_cards_with_long_context_embeddins | ---
dataset_info:
features:
- name: id
dtype: string
- name: lastModified
dtype: string
- name: tags
sequence: string
- name: author
dtype: string
- name: description
dtype: string
- name: citation
dtype: string
- name: likes
dtype: int64
- name: downloads
dtype: int64
- name: created
dtype: timestamp[us]
- name: card
dtype: string
- name: card_len
dtype: int64
- name: embeddings
sequence:
sequence: float32
splits:
- name: train
num_bytes: 334799613.62829113
num_examples: 64462
download_size: 128564352
dataset_size: 334799613.62829113
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "dataset_cards_with_long_context_embeddins"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
md-nishat-008/SentMix-3L | ---
license: agpl-3.0
---
# SentMix-3L: A Bangla-English-Hindi Code-Mixed Dataset for Sentiment Analysis
**Publication**: *The First Workshop in South East Asian Language Processing Workshop under AACL-2023.*
**Read in [arXiv](https://arxiv.org/pdf/2310.18023.pdf)**
---
## 📖 Introduction
Code-mixing is a well-studied linguistic phenomenon in which two or more languages are mixed in text or speech. Several datasets have been built with the goal of training computational models for code-mixing. Although it is very common to observe code-mixing involving multiple languages, most available datasets contain code-mixing between only two languages. In this paper, we introduce **SentMix-3L**, a novel dataset for sentiment analysis containing code-mixed data between three languages: Bangla, English, and Hindi. We show that zero-shot prompting with GPT-3.5 outperforms all transformer-based models on SentMix-3L.
---
## 📊 Dataset Details
We introduce **SentMix-3L**, a novel three-language code-mixed test dataset with gold standard labels in Bangla-Hindi-English for the task of Sentiment Analysis, containing 1,007 instances.
> We are presenting this dataset exclusively as a test set due to the unique and specialized nature of the task. Such data is very difficult to gather and requires significant expertise to access. The size of the dataset, while limiting for training purposes, offers a high-quality testing environment with gold-standard labels that can serve as a benchmark in this domain.
---
## 📈 Dataset Statistics
| | **All** | **Bangla** | **English** | **Hindi** | **Other** |
|-------------------|---------|------------|-------------|-----------|-----------|
| Tokens | 89494 | 32133 | 5998 | 15131 | 36232 |
| Types | 19686 | 8167 | 1073 | 1474 | 9092 |
| Max. in instance | 173 | 62 | 20 | 47 | 93 |
| Min. in instance | 41 | 4 | 3 | 2 | 8 |
| Avg | 88.87 | 31.91 | 5.96 | 15.03 | 35.98 |
| Std Dev | 19.19 | 8.39 | 2.94 | 5.81 | 9.70 |
*The row 'Avg' represents the average number of tokens with its standard deviation in row 'Std Dev'.*
---
## 📉 Results
| **Models** | **Weighted F1 Score** |
|---------------|-----------------------|
| GPT 3.5 Turbo | **0.62** |
| XLM-R | 0.59 |
| BanglishBERT | 0.56 |
| mBERT | 0.56 |
| BERT | 0.55 |
| roBERTa | 0.54 |
| MuRIL | 0.54 |
| IndicBERT | 0.53 |
| DistilBERT | 0.53 |
| HindiBERT | 0.48 |
| HingBERT | 0.47 |
| BanglaBERT | 0.47 |
*Weighted F-1 score for different models: training on synthetic, testing on natural data.*
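For reference, the weighted F1 reported above averages per-class F1 scores with weights proportional to each class's support. A minimal pure-Python sketch of that computation (toy labels, not SentMix-3L data or the authors' evaluation code):

```python
from collections import Counter

def weighted_f1(y_true, y_pred):
    """Per-class F1, averaged with weights proportional to class support."""
    n = len(y_true)
    score = 0.0
    for label, support in Counter(y_true).items():
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == label and p == label)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != label and p == label)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / support
        f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        score += f1 * support / n
    return score

# Toy three-class sentiment labels.
gold = ["pos", "pos", "neg", "neg", "neu"]
pred = ["pos", "neg", "neg", "neg", "neu"]
print(round(weighted_f1(gold, pred), 4))  # → 0.7867
```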
---
## 📝 Citation
If you utilize this dataset, kindly cite our paper.
```bibtex
@article{raihan2023sentmix,
title={SentMix-3L: A Bangla-English-Hindi Code-Mixed Dataset for Sentiment Analysis},
  author={Raihan, Md Nishat and Goswami, Dhiman and Mahmud, Antara and Anastasopoulos, Antonios and Zampieri, Marcos},
journal={arXiv preprint arXiv:2310.18023},
year={2023}
}
```
|
LumiOpen/arc_challenge_nb | ---
license: apache-2.0
language:
- nb
---
Norwegian machine-translated version of arc_challenge. |
shrutisingh/dataset_recommendation_mcq_mc | ---
license: apache-2.0
---
Task: MCQ with multiple correct answers.
Dataset: Recommendation of datasets to validate a research question.
This dataset is derived from the [DataFinder](https://aclanthology.org/2023.acl-long.573/) dataset. We curate the abstracts of each dataset from [PapersWithCode](https://paperswithcode.com/datasets).
Each instance provides a short `query` discussing a research question, along with keyphrases relevant to the query.
The original training set of the DataFinder dataset has positive and negative candidates for each query, to train a contrastive model.
Our objective is to convert the dataset into an MCQ question-answering task with multiple correct answers. We also add the abstracts from the research papers introducing the datasets so that context can be provided to the models.
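Because each question here can have several correct options, evaluation typically compares the predicted option set against the gold set rather than a single letter. A minimal sketch of per-instance scoring (the scheme is illustrative, not taken from the paper):

```python
def score_multi_mcq(gold_options, pred_options):
    """Score one multi-answer MCQ instance: exact set match plus Jaccard overlap."""
    g, p = set(gold_options), set(pred_options)
    exact = 1.0 if g == p else 0.0
    jaccard = len(g & p) / len(g | p) if g | p else 1.0
    return exact, jaccard

# Hypothetical instance: gold answers B and D, model predicted only B.
print(score_multi_mcq(["B", "D"], ["B"]))  # → (0.0, 0.5)
```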
To reproduce the construction of this dataset, please visit [https://github.com/shruti-singh/scidata_recommendation](https://github.com/shruti-singh/scidata_recommendation).
Please note that the query instances in this dataset have no intersection with the [`dataset_recommendation_mcq_sc`](https://huggingface.co/datasets/shrutisingh/dataset_recommendation_mcq_sc) dataset. [`dataset_recommendation_mcq_sc`](https://huggingface.co/datasets/shrutisingh/dataset_recommendation_mcq_sc) is a variant of this MCQ question-answering task with only a single correct answer. |
felix-red-panda/pomological_images_cleaned | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 63067722.0
num_examples: 12
download_size: 63059008
dataset_size: 63067722.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "pomological_images_cleaned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Sanatbek/aspect-based-sentiment-analysis-uzbek | ---
task_categories:
- text-classification
language:
- uz
size_categories:
- 1K<n<10K
configs:
- config_name: default
data_files: aspect-based-sentiment-analysis-uzbek.parquet
--- |
TifinLab/amazigh_moroccan_asr | ---
license: cc
dataset_info:
features:
- name: audio
dtype: audio
- name: licence
dtype: string
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 28331088.216
num_examples: 2968
- name: test
num_bytes: 11976679.336
num_examples: 1272
download_size: 38019235
dataset_size: 40307767.552
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
liuyanchen1015/MULTI_VALUE_wnli_who_which | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 1177
num_examples: 5
- name: test
num_bytes: 5047
num_examples: 14
- name: train
num_bytes: 7063
num_examples: 24
download_size: 14721
dataset_size: 13287
---
# Dataset Card for "MULTI_VALUE_wnli_who_which"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Vrushali/llama2 | ---
dataset_info:
features:
- name: QueryText
dtype: string
- name: KccAns
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 137336
num_examples: 792
download_size: 62499
dataset_size: 137336
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_arvindanand__ValidateAI-33B-slerp | ---
pretty_name: Evaluation run of arvindanand/ValidateAI-33B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [arvindanand/ValidateAI-33B-slerp](https://huggingface.co/arvindanand/ValidateAI-33B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_arvindanand__ValidateAI-33B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-10T12:15:46.754568](https://huggingface.co/datasets/open-llm-leaderboard/details_arvindanand__ValidateAI-33B-slerp/blob/main/results_2024-04-10T12-15-46.754568.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.39295856328198214,\n\
\ \"acc_stderr\": 0.03406418772246501,\n \"acc_norm\": 0.39843889983227476,\n\
\ \"acc_norm_stderr\": 0.03499234221764259,\n \"mc1\": 0.2582619339045288,\n\
\ \"mc1_stderr\": 0.015321821688476196,\n \"mc2\": 0.45660677334622446,\n\
\ \"mc2_stderr\": 0.01690873305006139\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2781569965870307,\n \"acc_stderr\": 0.013094469919538805,\n\
\ \"acc_norm\": 0.31143344709897613,\n \"acc_norm_stderr\": 0.013532472099850952\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3147779326827325,\n\
\ \"acc_stderr\": 0.0046347821561286105,\n \"acc_norm\": 0.36825333598884685,\n\
\ \"acc_norm_stderr\": 0.00481344861540443\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.362962962962963,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.362962962962963,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4276315789473684,\n \"acc_stderr\": 0.04026097083296559,\n\
\ \"acc_norm\": 0.4276315789473684,\n \"acc_norm_stderr\": 0.04026097083296559\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.37,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3849056603773585,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.3849056603773585,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2847222222222222,\n\
\ \"acc_stderr\": 0.03773809990686936,\n \"acc_norm\": 0.2847222222222222,\n\
\ \"acc_norm_stderr\": 0.03773809990686936\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3583815028901734,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.3583815028901734,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.04605661864718381,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.04605661864718381\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3702127659574468,\n \"acc_stderr\": 0.03156564682236785,\n\
\ \"acc_norm\": 0.3702127659574468,\n \"acc_norm_stderr\": 0.03156564682236785\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.328042328042328,\n \"acc_stderr\": 0.024180497164376893,\n \"\
acc_norm\": 0.328042328042328,\n \"acc_norm_stderr\": 0.024180497164376893\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4096774193548387,\n\
\ \"acc_stderr\": 0.027976054915347364,\n \"acc_norm\": 0.4096774193548387,\n\
\ \"acc_norm_stderr\": 0.027976054915347364\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.031785297106427496,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.031785297106427496\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5515151515151515,\n \"acc_stderr\": 0.038835659779569286,\n\
\ \"acc_norm\": 0.5515151515151515,\n \"acc_norm_stderr\": 0.038835659779569286\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4090909090909091,\n \"acc_stderr\": 0.03502975799413007,\n \"\
acc_norm\": 0.4090909090909091,\n \"acc_norm_stderr\": 0.03502975799413007\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.37305699481865284,\n \"acc_stderr\": 0.03490205592048574,\n\
\ \"acc_norm\": 0.37305699481865284,\n \"acc_norm_stderr\": 0.03490205592048574\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.31025641025641026,\n \"acc_stderr\": 0.023454674889404288,\n\
\ \"acc_norm\": 0.31025641025641026,\n \"acc_norm_stderr\": 0.023454674889404288\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176095,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176095\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3739495798319328,\n \"acc_stderr\": 0.031429466378837076,\n\
\ \"acc_norm\": 0.3739495798319328,\n \"acc_norm_stderr\": 0.031429466378837076\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.4935779816513762,\n \"acc_stderr\": 0.021435554820013074,\n \"\
acc_norm\": 0.4935779816513762,\n \"acc_norm_stderr\": 0.021435554820013074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3287037037037037,\n \"acc_stderr\": 0.03203614084670058,\n \"\
acc_norm\": 0.3287037037037037,\n \"acc_norm_stderr\": 0.03203614084670058\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4215686274509804,\n \"acc_stderr\": 0.03465868196380758,\n \"\
acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.03465868196380758\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.46835443037974683,\n \"acc_stderr\": 0.03248197400511075,\n \
\ \"acc_norm\": 0.46835443037974683,\n \"acc_norm_stderr\": 0.03248197400511075\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.45739910313901344,\n\
\ \"acc_stderr\": 0.03343577705583065,\n \"acc_norm\": 0.45739910313901344,\n\
\ \"acc_norm_stderr\": 0.03343577705583065\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3282442748091603,\n \"acc_stderr\": 0.04118438565806297,\n\
\ \"acc_norm\": 0.3282442748091603,\n \"acc_norm_stderr\": 0.04118438565806297\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5206611570247934,\n \"acc_stderr\": 0.04560456086387235,\n \"\
acc_norm\": 0.5206611570247934,\n \"acc_norm_stderr\": 0.04560456086387235\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4351851851851852,\n\
\ \"acc_stderr\": 0.04792898170907062,\n \"acc_norm\": 0.4351851851851852,\n\
\ \"acc_norm_stderr\": 0.04792898170907062\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4785276073619632,\n \"acc_stderr\": 0.0392474687675113,\n\
\ \"acc_norm\": 0.4785276073619632,\n \"acc_norm_stderr\": 0.0392474687675113\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5339805825242718,\n \"acc_stderr\": 0.0493929144727348,\n\
\ \"acc_norm\": 0.5339805825242718,\n \"acc_norm_stderr\": 0.0493929144727348\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.030882736974138646,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.030882736974138646\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4648786717752235,\n\
\ \"acc_stderr\": 0.017835798806290642,\n \"acc_norm\": 0.4648786717752235,\n\
\ \"acc_norm_stderr\": 0.017835798806290642\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4624277456647399,\n \"acc_stderr\": 0.026842985519615375,\n\
\ \"acc_norm\": 0.4624277456647399,\n \"acc_norm_stderr\": 0.026842985519615375\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3027932960893855,\n\
\ \"acc_stderr\": 0.015366860386397108,\n \"acc_norm\": 0.3027932960893855,\n\
\ \"acc_norm_stderr\": 0.015366860386397108\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.027826109307283693,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.027826109307283693\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.43729903536977494,\n\
\ \"acc_stderr\": 0.02817391776176288,\n \"acc_norm\": 0.43729903536977494,\n\
\ \"acc_norm_stderr\": 0.02817391776176288\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3549382716049383,\n \"acc_stderr\": 0.026624152478845853,\n\
\ \"acc_norm\": 0.3549382716049383,\n \"acc_norm_stderr\": 0.026624152478845853\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3191489361702128,\n \"acc_stderr\": 0.027807990141320207,\n \
\ \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.027807990141320207\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.31421121251629724,\n\
\ \"acc_stderr\": 0.011855911587048223,\n \"acc_norm\": 0.31421121251629724,\n\
\ \"acc_norm_stderr\": 0.011855911587048223\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.2867647058823529,\n \"acc_stderr\": 0.02747227447323382,\n\
\ \"acc_norm\": 0.2867647058823529,\n \"acc_norm_stderr\": 0.02747227447323382\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3888888888888889,\n \"acc_stderr\": 0.01972205893961806,\n \
\ \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.01972205893961806\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5545454545454546,\n\
\ \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.5545454545454546,\n\
\ \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.46530612244897956,\n \"acc_stderr\": 0.03193207024425314,\n\
\ \"acc_norm\": 0.46530612244897956,\n \"acc_norm_stderr\": 0.03193207024425314\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.472636815920398,\n\
\ \"acc_stderr\": 0.03530235517334682,\n \"acc_norm\": 0.472636815920398,\n\
\ \"acc_norm_stderr\": 0.03530235517334682\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3373493975903614,\n\
\ \"acc_stderr\": 0.03680783690727581,\n \"acc_norm\": 0.3373493975903614,\n\
\ \"acc_norm_stderr\": 0.03680783690727581\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.037867207062342145,\n\
\ \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.037867207062342145\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2582619339045288,\n\
\ \"mc1_stderr\": 0.015321821688476196,\n \"mc2\": 0.45660677334622446,\n\
\ \"mc2_stderr\": 0.01690873305006139\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5493291239147593,\n \"acc_stderr\": 0.01398392886904024\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/arvindanand/ValidateAI-33B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|arc:challenge|25_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|gsm8k|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hellaswag|10_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T12-15-46.754568.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-10T12-15-46.754568.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- '**/details_harness|winogrande|5_2024-04-10T12-15-46.754568.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-10T12-15-46.754568.parquet'
- config_name: results
data_files:
- split: 2024_04_10T12_15_46.754568
path:
- results_2024-04-10T12-15-46.754568.parquet
- split: latest
path:
- results_2024-04-10T12-15-46.754568.parquet
---
# Dataset Card for Evaluation run of arvindanand/ValidateAI-33B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [arvindanand/ValidateAI-33B-slerp](https://huggingface.co/arvindanand/ValidateAI-33B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_arvindanand__ValidateAI-33B-slerp",
"harness_winogrande_5",
split="train")
```
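
The timestamped splits above all follow the pattern `YYYY_MM_DDTHH_MM_SS.ffffff`, with a literal `latest` alias pointing at the most recent run. As an illustrative sketch (the helper name `latest_run_split` is not part of any official API), you can resolve the newest timestamped split yourself:

```python
from datetime import datetime

def latest_run_split(split_names):
    """Return the most recent timestamped split name.

    Split names use the pattern YYYY_MM_DDTHH_MM_SS.ffffff; the
    literal "latest" alias is skipped since it is not a timestamp.
    """
    timestamped = [s for s in split_names if s != "latest"]
    return max(
        timestamped,
        key=lambda s: datetime.strptime(s, "%Y_%m_%dT%H_%M_%S.%f"),
    )

splits = ["2024_04_10T12_15_46.754568", "latest"]
print(latest_run_split(splits))  # 2024_04_10T12_15_46.754568
```

With a single run, as here, this simply returns the one timestamped split; with several runs it picks the newest, matching what the "latest" split resolves to.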
## Latest results
These are the [latest results from run 2024-04-10T12:15:46.754568](https://huggingface.co/datasets/open-llm-leaderboard/details_arvindanand__ValidateAI-33B-slerp/blob/main/results_2024-04-10T12-15-46.754568.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.39295856328198214,
"acc_stderr": 0.03406418772246501,
"acc_norm": 0.39843889983227476,
"acc_norm_stderr": 0.03499234221764259,
"mc1": 0.2582619339045288,
"mc1_stderr": 0.015321821688476196,
"mc2": 0.45660677334622446,
"mc2_stderr": 0.01690873305006139
},
"harness|arc:challenge|25": {
"acc": 0.2781569965870307,
"acc_stderr": 0.013094469919538805,
"acc_norm": 0.31143344709897613,
"acc_norm_stderr": 0.013532472099850952
},
"harness|hellaswag|10": {
"acc": 0.3147779326827325,
"acc_stderr": 0.0046347821561286105,
"acc_norm": 0.36825333598884685,
"acc_norm_stderr": 0.00481344861540443
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4276315789473684,
"acc_stderr": 0.04026097083296559,
"acc_norm": 0.4276315789473684,
"acc_norm_stderr": 0.04026097083296559
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3849056603773585,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.3849056603773585,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2847222222222222,
"acc_stderr": 0.03773809990686936,
"acc_norm": 0.2847222222222222,
"acc_norm_stderr": 0.03773809990686936
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3583815028901734,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.3583815028901734,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3702127659574468,
"acc_stderr": 0.03156564682236785,
"acc_norm": 0.3702127659574468,
"acc_norm_stderr": 0.03156564682236785
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.328042328042328,
"acc_stderr": 0.024180497164376893,
"acc_norm": 0.328042328042328,
"acc_norm_stderr": 0.024180497164376893
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4096774193548387,
"acc_stderr": 0.027976054915347364,
"acc_norm": 0.4096774193548387,
"acc_norm_stderr": 0.027976054915347364
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.031785297106427496,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.031785297106427496
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5515151515151515,
"acc_stderr": 0.038835659779569286,
"acc_norm": 0.5515151515151515,
"acc_norm_stderr": 0.038835659779569286
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4090909090909091,
"acc_stderr": 0.03502975799413007,
"acc_norm": 0.4090909090909091,
"acc_norm_stderr": 0.03502975799413007
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.37305699481865284,
"acc_stderr": 0.03490205592048574,
"acc_norm": 0.37305699481865284,
"acc_norm_stderr": 0.03490205592048574
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.31025641025641026,
"acc_stderr": 0.023454674889404288,
"acc_norm": 0.31025641025641026,
"acc_norm_stderr": 0.023454674889404288
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.027738969632176095,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.027738969632176095
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3739495798319328,
"acc_stderr": 0.031429466378837076,
"acc_norm": 0.3739495798319328,
"acc_norm_stderr": 0.031429466378837076
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.4935779816513762,
"acc_stderr": 0.021435554820013074,
"acc_norm": 0.4935779816513762,
"acc_norm_stderr": 0.021435554820013074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3287037037037037,
"acc_stderr": 0.03203614084670058,
"acc_norm": 0.3287037037037037,
"acc_norm_stderr": 0.03203614084670058
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.03465868196380758,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.03465868196380758
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.46835443037974683,
"acc_stderr": 0.03248197400511075,
"acc_norm": 0.46835443037974683,
"acc_norm_stderr": 0.03248197400511075
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.45739910313901344,
"acc_stderr": 0.03343577705583065,
"acc_norm": 0.45739910313901344,
"acc_norm_stderr": 0.03343577705583065
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3282442748091603,
"acc_stderr": 0.04118438565806297,
"acc_norm": 0.3282442748091603,
"acc_norm_stderr": 0.04118438565806297
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5206611570247934,
"acc_stderr": 0.04560456086387235,
"acc_norm": 0.5206611570247934,
"acc_norm_stderr": 0.04560456086387235
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.04792898170907062,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.04792898170907062
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4785276073619632,
"acc_stderr": 0.0392474687675113,
"acc_norm": 0.4785276073619632,
"acc_norm_stderr": 0.0392474687675113
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.5339805825242718,
"acc_stderr": 0.0493929144727348,
"acc_norm": 0.5339805825242718,
"acc_norm_stderr": 0.0493929144727348
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.030882736974138646,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.030882736974138646
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4648786717752235,
"acc_stderr": 0.017835798806290642,
"acc_norm": 0.4648786717752235,
"acc_norm_stderr": 0.017835798806290642
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4624277456647399,
"acc_stderr": 0.026842985519615375,
"acc_norm": 0.4624277456647399,
"acc_norm_stderr": 0.026842985519615375
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3027932960893855,
"acc_stderr": 0.015366860386397108,
"acc_norm": 0.3027932960893855,
"acc_norm_stderr": 0.015366860386397108
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.027826109307283693,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.027826109307283693
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.43729903536977494,
"acc_stderr": 0.02817391776176288,
"acc_norm": 0.43729903536977494,
"acc_norm_stderr": 0.02817391776176288
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3549382716049383,
"acc_stderr": 0.026624152478845853,
"acc_norm": 0.3549382716049383,
"acc_norm_stderr": 0.026624152478845853
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3191489361702128,
"acc_stderr": 0.027807990141320207,
"acc_norm": 0.3191489361702128,
"acc_norm_stderr": 0.027807990141320207
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.31421121251629724,
"acc_stderr": 0.011855911587048223,
"acc_norm": 0.31421121251629724,
"acc_norm_stderr": 0.011855911587048223
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2867647058823529,
"acc_stderr": 0.02747227447323382,
"acc_norm": 0.2867647058823529,
"acc_norm_stderr": 0.02747227447323382
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.01972205893961806,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.01972205893961806
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5545454545454546,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.5545454545454546,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.46530612244897956,
"acc_stderr": 0.03193207024425314,
"acc_norm": 0.46530612244897956,
"acc_norm_stderr": 0.03193207024425314
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.472636815920398,
"acc_stderr": 0.03530235517334682,
"acc_norm": 0.472636815920398,
"acc_norm_stderr": 0.03530235517334682
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3373493975903614,
"acc_stderr": 0.03680783690727581,
"acc_norm": 0.3373493975903614,
"acc_norm_stderr": 0.03680783690727581
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.037867207062342145,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.037867207062342145
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2582619339045288,
"mc1_stderr": 0.015321821688476196,
"mc2": 0.45660677334622446,
"mc2_stderr": 0.01690873305006139
},
"harness|winogrande|5": {
"acc": 0.5493291239147593,
"acc_stderr": 0.01398392886904024
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
pretty_name: Evaluation run of digitous/Javelin-GPTJ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [digitous/Javelin-GPTJ](https://huggingface.co/digitous/Javelin-GPTJ) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_digitous__Javelin-GPTJ\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-16T01:31:09.179674](https://huggingface.co/datasets/open-llm-leaderboard/details_digitous__Javelin-GPTJ/blob/main/results_2023-10-16T01-31-09.179674.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0008389261744966443,\n\
\ \"em_stderr\": 0.0002964962989801232,\n \"f1\": 0.04767722315436259,\n\
\ \"f1_stderr\": 0.0011834240833723825,\n \"acc\": 0.3299344233062645,\n\
\ \"acc_stderr\": 0.008579096533320701\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0008389261744966443,\n \"em_stderr\": 0.0002964962989801232,\n\
\ \"f1\": 0.04767722315436259,\n \"f1_stderr\": 0.0011834240833723825\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01819560272934041,\n \
\ \"acc_stderr\": 0.0036816118940738727\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6416732438831886,\n \"acc_stderr\": 0.01347658117256753\n\
\ }\n}\n```"
repo_url: https://huggingface.co/digitous/Javelin-GPTJ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|arc:challenge|25_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_16T01_31_09.179674
path:
- '**/details_harness|drop|3_2023-10-16T01-31-09.179674.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-16T01-31-09.179674.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_16T01_31_09.179674
path:
- '**/details_harness|gsm8k|5_2023-10-16T01-31-09.179674.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-16T01-31-09.179674.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hellaswag|10_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:13:27.511337.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T14:13:27.511337.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T14:13:27.511337.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_16T01_31_09.179674
path:
- '**/details_harness|winogrande|5_2023-10-16T01-31-09.179674.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-16T01-31-09.179674.parquet'
- config_name: results
data_files:
- split: 2023_07_19T14_13_27.511337
path:
- results_2023-07-19T14:13:27.511337.parquet
- split: 2023_10_16T01_31_09.179674
path:
- results_2023-10-16T01-31-09.179674.parquet
- split: latest
path:
- results_2023-10-16T01-31-09.179674.parquet
---
# Dataset Card for Evaluation run of digitous/Javelin-GPTJ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/digitous/Javelin-GPTJ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [digitous/Javelin-GPTJ](https://huggingface.co/digitous/Javelin-GPTJ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_digitous__Javelin-GPTJ",
"harness_winogrande_5",
	split="latest")
```
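As the YAML header above suggests, each per-run split name appears to be the run timestamp with `-` and `:` replaced by `_` (an observation based on the names visible in this card, not the leaderboard's official code):

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp to the corresponding split name.

    "2023-07-19T14:13:27.511337" -> "2023_07_19T14_13_27.511337"
    """
    # Split names cannot contain "-" or ":", so both are replaced with "_".
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-10-16T01:31:09.179674"))
```

This lets you look up the split for a specific run programmatically instead of copying the name from the YAML by hand.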
## Latest results
These are the [latest results from run 2023-10-16T01:31:09.179674](https://huggingface.co/datasets/open-llm-leaderboard/details_digitous__Javelin-GPTJ/blob/main/results_2023-10-16T01-31-09.179674.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.0008389261744966443,
"em_stderr": 0.0002964962989801232,
"f1": 0.04767722315436259,
"f1_stderr": 0.0011834240833723825,
"acc": 0.3299344233062645,
"acc_stderr": 0.008579096533320701
},
"harness|drop|3": {
"em": 0.0008389261744966443,
"em_stderr": 0.0002964962989801232,
"f1": 0.04767722315436259,
"f1_stderr": 0.0011834240833723825
},
"harness|gsm8k|5": {
"acc": 0.01819560272934041,
"acc_stderr": 0.0036816118940738727
},
"harness|winogrande|5": {
"acc": 0.6416732438831886,
"acc_stderr": 0.01347658117256753
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
EsakkiSundar/sundarpersonal | ---
license: mit
---
|
atutej/xnli_custom | ---
dataset_info:
config_name: hi
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
splits:
- name: test
num_bytes: 1299702
num_examples: 5010
- name: validation
num_bytes: 648568
num_examples: 2490
download_size: 609456
dataset_size: 1948270
configs:
- config_name: hi
data_files:
- split: test
path: hi/test-*
- split: validation
path: hi/validation-*
---
|
cahya/nusa-mt | ---
annotations_creators:
- no-annotation
language_creators:
- crowdsourced
language:
- en
- id
license:
- cc-by-2.0
multilinguality:
- translation
pretty_name: Dataset Collection for Indonesian Machine Translation
size_categories:
- unknown
source_datasets:
- original
task_categories:
- translation
- text-generation
---
# Nusa-MT
Dataset collection for Indonesian machine translation. The data comes from the following sources:
- ELRC_2922
- GlobalVoices
- News-Commentary
- Tatoeba
- Tico-19
|
hpprc/tanaka-corpus | ---
dataset_info:
features:
- name: id
dtype: string
- name: ja
dtype: string
- name: en
dtype: string
splits:
- name: train
num_bytes: 17758809
num_examples: 147876
download_size: 10012915
dataset_size: 17758809
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: cc-by-4.0
task_categories:
- translation
language:
- ja
- en
pretty_name: tanaka-corpus
size_categories:
- 100K<n<1M
---
HF Datasets version of [Tanaka Corpus](http://www.edrdg.org/wiki/index.php/Tanaka_Corpus).
## Preprocess for HF Datasets
The original data was preprocessed as follows:
```bash
wget ftp://ftp.edrdg.org/pub/Nihongo/examples.utf.gz
gunzip examples.utf.gz
```
```python
import re
from pathlib import Path

from more_itertools import chunked

import datasets as ds

data = []
with Path("examples.utf").open() as f:
    # Each example spans two lines ("A:" and "B:"); only the "A:" line,
    # which carries the Japanese text, the English text, and the ID, is parsed.
    for row, _ in chunked(f, 2):
        ja, en, idx = re.findall(r"A: (.*?)\t(.*?)#ID=(.*$)", row)[0]
        data.append(
            {
                "id": idx,
                "ja": ja.strip(),
                "en": en.strip(),
            }
        )

dataset = ds.Dataset.from_list(data)
dataset.push_to_hub("hpprc/tanaka-corpus")
``` |
ftopal/huggingface-models-processed | ---
dataset_info:
features:
- name: sha
dtype: 'null'
- name: last_modified
dtype: 'null'
- name: library_name
dtype: string
- name: text
dtype: string
- name: metadata
dtype: string
- name: pipeline_tag
dtype: string
- name: id
dtype: string
- name: tags
sequence: string
- name: created_at
dtype: string
- name: arxiv
sequence: string
- name: languages
sequence: string
- name: tags_str
dtype: string
- name: text_str
dtype: string
- name: text_lists
sequence: string
- name: processed_texts
sequence: string
- name: tokens_length
sequence: int64
- name: input_texts
sequence: string
splits:
- name: train
num_bytes: 1788749855
num_examples: 240530
download_size: 481626308
dataset_size: 1788749855
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Hack90/ncbi_genbank_part_2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: sequence
dtype: string
- name: name
dtype: string
- name: description
dtype: string
- name: features
dtype: int64
- name: seq_length
dtype: int64
splits:
- name: train
num_bytes: 20552040218
num_examples: 10205
download_size: 6137836807
dataset_size: 20552040218
---
# Dataset Card for "ncbi_genbank_part_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mlsquare/SERVER_samantar_mixed_val | ---
dataset_info:
features:
- name: tgt
dtype: string
splits:
- name: train
num_bytes: 1019361.24
num_examples: 7920
download_size: 682209
dataset_size: 1019361.24
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "SERVER_samantar_mixed_val"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yuvalkirstain/beautiful_interesting_spectacular_photo_futuristic_25000 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: width
dtype: int64
- name: height
dtype: int64
- name: pclean
dtype: float64
splits:
- name: train
num_bytes: 406730039.0
num_examples: 596
download_size: 406731237
dataset_size: 406730039.0
---
# Dataset Card for "beautiful_interesting_spectacular_photo_futuristic_25000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/wiki_find_passage_train30_eval20_num | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 55436
num_examples: 80
- name: validation
num_bytes: 15332
num_examples: 20
download_size: 41704
dataset_size: 70768
---
# Dataset Card for "wiki_find_passage_train30_eval20_num"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
herutriana44/QA_MSIB | ---
license: apache-2.0
---
|
ikno/rinko_couple_v2.13 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: situation
dtype: string
- name: len_messages
dtype: int64
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: relation
dtype: string
- name: fact
dtype: string
- name: post
dtype: string
splits:
- name: train
num_bytes: 29058470.0
num_examples: 5850
- name: test
num_bytes: 2638512
num_examples: 442
download_size: 17688580
dataset_size: 31696982.0
---
# Dataset Card for "rinko_couple_v2.13"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alvarobartt/ultrafeedback-instruction-dataset | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: generations
sequence: string
- name: raw_generation_response
sequence: string
- name: rating
sequence: int64
- name: rationale
sequence: string
- name: raw_labelling_response
struct:
- name: choices
list:
- name: finish_reason
dtype: string
- name: index
dtype: int64
- name: message
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: created
dtype: int64
- name: id
dtype: string
- name: model
dtype: string
- name: object
dtype: string
- name: usage
struct:
- name: completion_tokens
dtype: int64
- name: prompt_tokens
dtype: int64
- name: total_tokens
dtype: int64
splits:
- name: train
num_bytes: 167493
num_examples: 50
download_size: 98372
dataset_size: 167493
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ultrafeedback-instruction-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dim/chip2_instruct_alpha_prompt_ru | ---
dataset_info:
features:
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 120371757
num_examples: 162087
download_size: 58859759
dataset_size: 120371757
---
# Dataset Card for "chip2_instruct_alpha_prompt_ru"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
GenVRadmin/Samvaad-Punjabi-Mini | ---
license: mit
---
|