| datasetId | card |
|---|---|
zaursamedov1/OpenAI-cookbook | ---
license: wtfpl
---
|
autoevaluate/autoeval-eval-cuad-default-2fec59-2004766522 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- cuad
eval_info:
  task: extractive_question_answering
  model: 123tarunanand/roberta-base-finetuned
  metrics: ['recall']
  dataset_name: cuad
  dataset_config: default
  dataset_split: test
  col_mapping:
    context: context
    question: question
    answers-text: answers.text
    answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: 123tarunanand/roberta-base-finetuned
* Dataset: cuad
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
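The `col_mapping` in the metadata above assumes SQuAD-style records with `context`, `question`, and nested `answers` columns. A hypothetical record (illustrative only, not taken from CUAD) with that shape:

```python
# Hypothetical SQuAD-style record matching the col_mapping above:
# context, question, answers.text, answers.answer_start.
example = {
    "context": "The agreement was signed on 1 January 2020.",
    "question": "When was the agreement signed?",
    "answers": {"text": ["1 January 2020"], "answer_start": [28]},
}

# answer_start indexes the answer span inside the context string.
start = example["answers"]["answer_start"][0]
answer = example["answers"]["text"][0]
print(example["context"][start:start + len(answer)])  # 1 January 2020
```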
## Contributions
Thanks to [@adrienheymans](https://huggingface.co/adrienheymans) for evaluating this model. |
huggingartists/aimer | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/aimer"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.237926 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/123a0b2ef09a25207b610c5bd7b21d0f.1000x1000x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/aimer">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Aimer</div>
<a href="https://genius.com/artists/aimer">
<div style="text-align: center; font-size: 14px;">@aimer</div>
</a>
</div>
### Dataset Summary
This is a lyrics dataset parsed from Genius, designed for generating lyrics with HuggingArtists.
The corresponding model is available [here](https://huggingface.co/huggingartists/aimer).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
You can load this dataset directly with the `datasets` library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/aimer")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train | validation | test |
|------:|-----------:|-----:|
|   171 |          - |    - |
The 'train' split can easily be divided into 'train', 'validation', and 'test' splits with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np

datasets = load_dataset("huggingartists/aimer")

train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03

# Cut the list of texts at the 90% and 97% marks to get three pieces.
texts = datasets["train"]["text"]
train, validation, test = np.split(
    texts,
    [
        int(len(texts) * train_percentage),
        int(len(texts) * (train_percentage + validation_percentage)),
    ],
)

datasets = DatasetDict(
    {
        "train": Dataset.from_dict({"text": list(train)}),
        "validation": Dataset.from_dict({"text": list(validation)}),
        "test": Dataset.from_dict({"text": list(test)}),
    }
)
```
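Applied to the 171 examples in the original 'train' split, these percentages produce splits of 153, 12, and 6 examples, because the cut points are integer-truncated indices; a quick sketch:

```python
n = 171  # number of examples in the original 'train' split
train_percentage, validation_percentage = 0.9, 0.07

# The split is cut at these two indices, so the three pieces have these sizes:
train_end = int(n * train_percentage)                                 # 153
validation_end = int(n * (train_percentage + validation_percentage))  # 165
sizes = (train_end, validation_end - train_end, n - validation_end)
print(sizes)  # (153, 12, 6)
```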
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
  author = {Aleksey Korshuk},
  year = {2021}
}
```
## About
*Built by Aleksey Korshuk*
[GitHub](https://github.com/AlekseyKorshuk)
[Twitter](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[Telegram](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the [project repository](https://github.com/AlekseyKorshuk/huggingartists).
|
vhtran/en-id | ---
license: cc-by-4.0
task_categories:
- translation
language:
- en
- id
pretty_name: enidlrmt
--- |
FanChen0116/few_32_empty | ---
dataset_info:
  features:
  - name: id
    dtype: int64
  - name: tokens
    sequence: string
  - name: labels
    sequence:
      class_label:
        names:
          '0': O
          '1': I-time
          '2': B-date
          '3': B-last_name
          '4': B-people
          '5': I-date
          '6': I-people
          '7': I-last_name
          '8': I-first_name
          '9': B-first_name
          '10': B-time
  - name: request_slot
    sequence: string
  splits:
  - name: train
    num_bytes: 5225
    num_examples: 32
  - name: validation
    num_bytes: 4861
    num_examples: 32
  - name: test
    num_bytes: 5405
    num_examples: 32
  download_size: 11658
  dataset_size: 15491
---
# Dataset Card for "few_32_empty"
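The `labels` feature in this card's metadata is a sequence of class ids over the eleven BIO tags listed there. As a minimal pure-Python sketch (the `decode` helper is hypothetical, not part of the dataset), the id-to-tag mapping works like this:

```python
# The eleven BIO tags declared in the dataset_info metadata, keyed by class id.
ID2TAG = {
    0: "O", 1: "I-time", 2: "B-date", 3: "B-last_name", 4: "B-people",
    5: "I-date", 6: "I-people", 7: "I-last_name", 8: "I-first_name",
    9: "B-first_name", 10: "B-time",
}
TAG2ID = {tag: i for i, tag in ID2TAG.items()}

def decode(label_ids):
    """Map a sequence of class ids back to their BIO tag names."""
    return [ID2TAG[i] for i in label_ids]

print(decode([4, 6, 0, 2, 5]))  # ['B-people', 'I-people', 'O', 'B-date', 'I-date']
```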
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
iamkaikai/MATISSEE-ART | ---
dataset_info:
  features:
  - name: image
    dtype: image
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 6679969.0
    num_examples: 269
  download_size: 6585569
  dataset_size: 6679969.0
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
# Dataset Card for "MATISSEE-ART"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sinsforeal/yuyu | ---
license: openrail
---
|
dtadpole/sharegpt-20230401 | ---
license: mit
---
|
lcampillos/ctebmsp | ---
license: cc-by-4.0
language:
- es
multilinguality:
- monolingual
task_categories:
- token-classification
task_ids:
- named-entity-recognition
pretty_name: CT-EBM-SP
---
# CT-EBM-SP (Clinical Trials for Evidence-based Medicine in Spanish)
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** http://www.lllf.uam.es/ESP/nlpmedterm_en.html
- **Repository:** http://www.lllf.uam.es/ESP/nlpdata/wp2/CT-EBM-SP.zip
- **Paper:** Campillos-Llanos, L., Valverde-Mateos, A., Capllonch-Carrión, A., & Moreno-Sandoval, A. (2021). A clinical trials corpus annotated with UMLS entities to enhance the access to evidence-based medicine. BMC medical informatics and decision making, 21(1), 1-19
- **Point of Contact:** leonardo.campillos AT gmail.com
### Dataset Summary
The [Clinical Trials for Evidence-Based-Medicine in Spanish corpus](http://www.lllf.uam.es/ESP/nlpdata/wp2/) is a collection of 1200 texts about clinical trial studies and clinical trial announcements:
- 500 abstracts from journals published under a Creative Commons license, e.g. available in PubMed or the Scientific Electronic Library Online (SciELO)
- 700 clinical trial announcements published in the European Clinical Trials Register and the Repositorio Español de Estudios Clínicos
If you use the CT-EBM-SP resource, please cite as follows:
```
@article{campillosetal-midm2021,
title = {A clinical trials corpus annotated with UMLS© entities to enhance the access to Evidence-Based Medicine},
author = {Campillos-Llanos, Leonardo and Valverde-Mateos, Ana and Capllonch-Carri{\'o}n, Adri{\'a}n and Moreno-Sandoval, Antonio},
journal = {BMC Medical Informatics and Decision Making},
volume={21},
number={1},
pages={1--19},
year={2021},
publisher={BioMed Central}
}
```
### Supported Tasks and Leaderboards
Medical Named Entity Recognition
### Languages
Spanish
## Dataset Structure
### Data Instances
- 292 173 tokens
- 46 699 entities of the following [Unified Medical Language System (UMLS)](https://www.nlm.nih.gov/research/umls/index.html) semantic groups:
- ANAT (anatomy and body parts): 6728 entities
- CHEM (chemical and pharmacological substances): 9224 entities
- DISO (pathologic conditions): 13 067 entities
- PROC (therapeutic and diagnostic procedures, and laboratory analyses): 17 680 entities
### Data Splits
- Train: 175 203 tokens, 28 101 entities
- Development: 58 670 tokens, 9629 entities
- Test: 58 300 tokens, 8969 entities
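As an arithmetic sanity check (not part of the original card), the per-split counts above sum to the corpus totals reported under Data Instances:

```python
# Per-split counts from the Data Splits section.
tokens = {"train": 175_203, "development": 58_670, "test": 58_300}
entities = {"train": 28_101, "development": 9_629, "test": 8_969}
# Entity counts per UMLS semantic group from the Data Instances section.
semantic_groups = {"ANAT": 6_728, "CHEM": 9_224, "DISO": 13_067, "PROC": 17_680}

print(sum(tokens.values()))           # 292173 tokens in total
print(sum(entities.values()))         # 46699 entities in total
print(sum(semantic_groups.values()))  # 46699, consistent with the split totals
```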
## Dataset Creation
### Source Data
- Abstracts from journals published under a Creative Commons license, available in [PubMed](https://pubmed.ncbi.nlm.nih.gov/) or the [Scientific Electronic Library Online (SciELO)](https://scielo.org/es/)
- Clinical trials announcements published in the [European Clinical Trials Register](https://www.clinicaltrialsregister.eu) and [Repositorio Español de Estudios Clínicos](https://reec.aemps.es)
### Annotations
#### Who are the annotators?
- Leonardo Campillos-Llanos, Computational Linguist, Consejo Superior de Investigaciones Científicas
- Adrián Capllonch-Carrión, Medical Doctor, Centro de Salud Retiro, Hospital Universitario Gregorio Marañón
- Ana Valverde-Mateos, Medical Lexicographer, Spanish Royal Academy of Medicine
## Considerations for Using the Data
**Disclosure**: This dataset is under development and needs to be improved. It should not be used for medical decision making without human assistance and supervision.
This resource is intended for a generalist purpose, and may have bias and/or any other undesirable distortions.
The owner or creator of the models will in no event be liable for any results arising from the use made by third parties of this dataset.
**Descargo de responsabilidad**: Este conjunto de datos se encuentra en desarrollo y no debe ser empleado para la toma de decisiones médicas.
La finalidad de este modelo es generalista, y puede tener sesgos y/u otro tipo de distorsiones indeseables.
El propietario o creador de los modelos de ningún modo será responsable de los resultados derivados del uso que las terceras partes hagan de estos datos. |
maidalun1020/CrosslingualRetrievalFinanceEn2Zh-qrels | ---
license: apache-2.0
configs:
- config_name: default
  data_files:
  - split: dev
    path: data/dev-*
dataset_info:
  features:
  - name: qid
    dtype: string
  - name: pid
    dtype: string
  - name: score
    dtype: int64
  splits:
  - name: dev
    num_bytes: 620752
    num_examples: 25525
  download_size: 332079
  dataset_size: 620752
---
|
lhallee/ssq3 | ---
dataset_info:
  features:
  - name: seqs
    dtype: string
  - name: labels
    dtype: string
  splits:
  - name: train
    num_bytes: 5373910
    num_examples: 10792
  - name: valid
    num_bytes: 331482
    num_examples: 626
  - name: test
    num_bytes: 22594
    num_examples: 50
  download_size: 3780271
  dataset_size: 5727986
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: valid
    path: data/valid-*
  - split: test
    path: data/test-*
---
|
tellarin-ai/ntx_llm_inst_korean | ---
license: cc-by-sa-4.0
language:
- ko
task_categories:
- token-classification
---
# Dataset Card for NTX v1 in the Aya format - Korean subset
This dataset is a conversion of the Korean data from the original NTX into the Aya instruction format, and it is released here under the CC-BY-SA 4.0 license.
## Dataset Details
For the original NTX dataset, the conversion to the Aya instruction format, or more details, please refer to the [full dataset in instruction form](https://huggingface.co/datasets/tellarin-ai/ntx_llm_instructions) or to the paper below.
**NOTE:** Unfortunately, due to a conversion issue with numerical expressions, this version only includes the temporal expressions part of NTX.
## Citation
If you use this dataset version, feel free to cite/footnote the complete version at https://huggingface.co/datasets/tellarin-ai/ntx_llm_instructions, but please also cite the *original dataset publication*.
**BibTeX:**
```
@misc{chen2023dataset,
title={Dataset and Baseline System for Multi-lingual Extraction and Normalization of Temporal and Numerical Expressions},
author={Sanxing Chen and Yongqiang Chen and Börje F. Karlsson},
year={2023},
eprint={2303.18103},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
yangyz1230/splice_sites_acceptors | ---
dataset_info:
  features:
  - name: name
    dtype: string
  - name: sequence
    dtype: string
  - name: chrom
    dtype: string
  - name: start
    dtype: int64
  - name: end
    dtype: int64
  - name: strand
    dtype: string
  - name: label
    dtype: int64
  splits:
  - name: train
    num_bytes: 829804
    num_examples: 1277
  - name: test
    num_bytes: 98616
    num_examples: 152
  download_size: 451695
  dataset_size: 928420
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: test
    path: data/test-*
---
|
open-llm-leaderboard/details_ALBADDAWI__DeepCode-7B-Aurora-v2 | ---
pretty_name: Evaluation run of ALBADDAWI/DeepCode-7B-Aurora-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ALBADDAWI/DeepCode-7B-Aurora-v2](https://huggingface.co/ALBADDAWI/DeepCode-7B-Aurora-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ALBADDAWI__DeepCode-7B-Aurora-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-10T18:14:07.682460](https://huggingface.co/datasets/open-llm-leaderboard/details_ALBADDAWI__DeepCode-7B-Aurora-v2/blob/main/results_2024-04-10T18-14-07.682460.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5629600724652444,\n\
\ \"acc_stderr\": 0.034437940352628316,\n \"acc_norm\": 0.570143603312895,\n\
\ \"acc_norm_stderr\": 0.03516996401091805,\n \"mc1\": 0.29253365973072215,\n\
\ \"mc1_stderr\": 0.015925597445286165,\n \"mc2\": 0.4051174166535907,\n\
\ \"mc2_stderr\": 0.015044532390937759\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5136518771331058,\n \"acc_stderr\": 0.014605943429860947,\n\
\ \"acc_norm\": 0.5537542662116041,\n \"acc_norm_stderr\": 0.014526705548539982\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5436168094005178,\n\
\ \"acc_stderr\": 0.004970759774676881,\n \"acc_norm\": 0.7209719179446326,\n\
\ \"acc_norm_stderr\": 0.004476047101806569\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37777777777777777,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.37777777777777777,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.03894734487013317,\n\
\ \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.03894734487013317\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5660377358490566,\n \"acc_stderr\": 0.03050329201334259,\n\
\ \"acc_norm\": 0.5660377358490566,\n \"acc_norm_stderr\": 0.03050329201334259\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\
\ \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.5902777777777778,\n\
\ \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n\
\ \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n\
\ \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6468085106382979,\n \"acc_stderr\": 0.031245325202761926,\n\
\ \"acc_norm\": 0.6468085106382979,\n \"acc_norm_stderr\": 0.031245325202761926\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.04013124195424386,\n\
\ \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.04013124195424386\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5687830687830688,\n \"acc_stderr\": 0.0255064816981382,\n \"acc_norm\"\
: 0.5687830687830688,\n \"acc_norm_stderr\": 0.0255064816981382\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5317460317460317,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.5317460317460317,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6387096774193548,\n \"acc_stderr\": 0.02732754844795754,\n \"\
acc_norm\": 0.6387096774193548,\n \"acc_norm_stderr\": 0.02732754844795754\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"\
acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091707,\n\
\ \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091707\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7070707070707071,\n \"acc_stderr\": 0.03242497958178815,\n \"\
acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.03242497958178815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6839378238341969,\n \"acc_stderr\": 0.03355397369686172,\n\
\ \"acc_norm\": 0.6839378238341969,\n \"acc_norm_stderr\": 0.03355397369686172\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5846153846153846,\n \"acc_stderr\": 0.024985354923102335,\n\
\ \"acc_norm\": 0.5846153846153846,\n \"acc_norm_stderr\": 0.024985354923102335\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.4111111111111111,\n \"acc_stderr\": 0.02999992350870668,\n \
\ \"acc_norm\": 0.4111111111111111,\n \"acc_norm_stderr\": 0.02999992350870668\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7100840336134454,\n \"acc_stderr\": 0.029472485833136094,\n\
\ \"acc_norm\": 0.7100840336134454,\n \"acc_norm_stderr\": 0.029472485833136094\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.41721854304635764,\n \"acc_stderr\": 0.04026141497634611,\n \"\
acc_norm\": 0.41721854304635764,\n \"acc_norm_stderr\": 0.04026141497634611\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.744954128440367,\n \"acc_stderr\": 0.018688500856535825,\n \"\
acc_norm\": 0.744954128440367,\n \"acc_norm_stderr\": 0.018688500856535825\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5555555555555556,\n \"acc_stderr\": 0.03388857118502325,\n \"\
acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03388857118502325\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5588235294117647,\n \"acc_stderr\": 0.034849415144292316,\n \"\
acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.034849415144292316\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6708860759493671,\n \"acc_stderr\": 0.030587326294702358,\n \
\ \"acc_norm\": 0.6708860759493671,\n \"acc_norm_stderr\": 0.030587326294702358\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5560538116591929,\n\
\ \"acc_stderr\": 0.03334625674242728,\n \"acc_norm\": 0.5560538116591929,\n\
\ \"acc_norm_stderr\": 0.03334625674242728\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009224,\n\
\ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009224\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6776859504132231,\n \"acc_stderr\": 0.04266416363352168,\n \"\
acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.04266416363352168\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6203703703703703,\n\
\ \"acc_stderr\": 0.04691521224077742,\n \"acc_norm\": 0.6203703703703703,\n\
\ \"acc_norm_stderr\": 0.04691521224077742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.025140935950335435,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.025140935950335435\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6819923371647509,\n\
\ \"acc_stderr\": 0.016653486275615387,\n \"acc_norm\": 0.6819923371647509,\n\
\ \"acc_norm_stderr\": 0.016653486275615387\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5780346820809249,\n \"acc_stderr\": 0.02658923114217426,\n\
\ \"acc_norm\": 0.5780346820809249,\n \"acc_norm_stderr\": 0.02658923114217426\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31731843575418994,\n\
\ \"acc_stderr\": 0.015566392630057027,\n \"acc_norm\": 0.31731843575418994,\n\
\ \"acc_norm_stderr\": 0.015566392630057027\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.02849199358617156,\n\
\ \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.02849199358617156\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5916398713826366,\n\
\ \"acc_stderr\": 0.027917050748484627,\n \"acc_norm\": 0.5916398713826366,\n\
\ \"acc_norm_stderr\": 0.027917050748484627\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5432098765432098,\n \"acc_stderr\": 0.027716661650194038,\n\
\ \"acc_norm\": 0.5432098765432098,\n \"acc_norm_stderr\": 0.027716661650194038\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.39361702127659576,\n \"acc_stderr\": 0.029144544781596157,\n \
\ \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.029144544781596157\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3852672750977836,\n\
\ \"acc_stderr\": 0.01242948543495521,\n \"acc_norm\": 0.3852672750977836,\n\
\ \"acc_norm_stderr\": 0.01242948543495521\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.39338235294117646,\n \"acc_stderr\": 0.02967428828131118,\n\
\ \"acc_norm\": 0.39338235294117646,\n \"acc_norm_stderr\": 0.02967428828131118\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5147058823529411,\n \"acc_stderr\": 0.020219083895133924,\n \
\ \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.020219083895133924\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\
\ \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n\
\ \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.03029950656215418,\n\
\ \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.03029950656215418\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n\
\ \"acc_stderr\": 0.031524391865554016,\n \"acc_norm\": 0.7263681592039801,\n\
\ \"acc_norm_stderr\": 0.031524391865554016\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.038284011150790206,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.038284011150790206\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5847953216374269,\n \"acc_stderr\": 0.03779275945503201,\n\
\ \"acc_norm\": 0.5847953216374269,\n \"acc_norm_stderr\": 0.03779275945503201\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29253365973072215,\n\
\ \"mc1_stderr\": 0.015925597445286165,\n \"mc2\": 0.4051174166535907,\n\
\ \"mc2_stderr\": 0.015044532390937759\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6708760852407262,\n \"acc_stderr\": 0.01320638708909147\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2486732373009856,\n \
\ \"acc_stderr\": 0.011906147222879979\n }\n}\n```"
repo_url: https://huggingface.co/ALBADDAWI/DeepCode-7B-Aurora-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
  data_files:
  - split: 2024_04_10T15_20_01.652451
    path:
    - '**/details_harness|arc:challenge|25_2024-04-10T15-20-01.652451.parquet'
  - split: 2024_04_10T18_14_07.682460
    path:
    - '**/details_harness|arc:challenge|25_2024-04-10T18-14-07.682460.parquet'
  - split: latest
    path:
    - '**/details_harness|arc:challenge|25_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_gsm8k_5
  data_files:
  - split: 2024_04_10T15_20_01.652451
    path:
    - '**/details_harness|gsm8k|5_2024-04-10T15-20-01.652451.parquet'
  - split: 2024_04_10T18_14_07.682460
    path:
    - '**/details_harness|gsm8k|5_2024-04-10T18-14-07.682460.parquet'
  - split: latest
    path:
    - '**/details_harness|gsm8k|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hellaswag_10
  data_files:
  - split: 2024_04_10T15_20_01.652451
    path:
    - '**/details_harness|hellaswag|10_2024-04-10T15-20-01.652451.parquet'
  - split: 2024_04_10T18_14_07.682460
    path:
    - '**/details_harness|hellaswag|10_2024-04-10T18-14-07.682460.parquet'
  - split: latest
    path:
    - '**/details_harness|hellaswag|10_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_5
  data_files:
  - split: 2024_04_10T15_20_01.652451
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T15-20-01.652451.parquet'
    - '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T15-20-01.652451.parquet'
    - '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T15-20-01.652451.parquet'
    - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T15-20-01.652451.parquet'
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T15-20-01.652451.parquet'
    - '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T15-20-01.652451.parquet'
    - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T15-20-01.652451.parquet'
    - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T15-20-01.652451.parquet'
    - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T15-20-01.652451.parquet'
    - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T15-20-01.652451.parquet'
    - '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T15-20-01.652451.parquet'
    - '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T15-20-01.652451.parquet'
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T15-20-01.652451.parquet'
    - '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T15-20-01.652451.parquet'
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T15-20-01.652451.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T18-14-07.682460.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-10T18-14-07.682460.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- '**/details_harness|winogrande|5_2024-04-10T15-20-01.652451.parquet'
- split: 2024_04_10T18_14_07.682460
path:
- '**/details_harness|winogrande|5_2024-04-10T18-14-07.682460.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-10T18-14-07.682460.parquet'
- config_name: results
data_files:
- split: 2024_04_10T15_20_01.652451
path:
- results_2024-04-10T15-20-01.652451.parquet
- split: 2024_04_10T18_14_07.682460
path:
- results_2024-04-10T18-14-07.682460.parquet
- split: latest
path:
- results_2024-04-10T18-14-07.682460.parquet
---
# Dataset Card for Evaluation run of ALBADDAWI/DeepCode-7B-Aurora-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ALBADDAWI/DeepCode-7B-Aurora-v2](https://huggingface.co/ALBADDAWI/DeepCode-7B-Aurora-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ALBADDAWI__DeepCode-7B-Aurora-v2",
"harness_winogrande_5",
	split="latest")
```
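Per-run splits follow a fixed naming convention: the run timestamp with `-` and `:` replaced by `_` (plus the `latest` alias). A minimal sketch of that mapping — the timestamp values here are taken from the two runs recorded in this repository:

```python
def timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp to its split name by replacing '-' and ':' with '_'."""
    return timestamp.replace("-", "_").replace(":", "_")

# Timestamps of the two evaluation runs in this repository
runs = ["2024-04-10T15:20:01.652451", "2024-04-10T18:14:07.682460"]
splits = [timestamp_to_split(t) for t in runs]
print(splits)  # ['2024_04_10T15_20_01.652451', '2024_04_10T18_14_07.682460']
```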
## Latest results
These are the [latest results from run 2024-04-10T18:14:07.682460](https://huggingface.co/datasets/open-llm-leaderboard/details_ALBADDAWI__DeepCode-7B-Aurora-v2/blob/main/results_2024-04-10T18-14-07.682460.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.5629600724652444,
"acc_stderr": 0.034437940352628316,
"acc_norm": 0.570143603312895,
"acc_norm_stderr": 0.03516996401091805,
"mc1": 0.29253365973072215,
"mc1_stderr": 0.015925597445286165,
"mc2": 0.4051174166535907,
"mc2_stderr": 0.015044532390937759
},
"harness|arc:challenge|25": {
"acc": 0.5136518771331058,
"acc_stderr": 0.014605943429860947,
"acc_norm": 0.5537542662116041,
"acc_norm_stderr": 0.014526705548539982
},
"harness|hellaswag|10": {
"acc": 0.5436168094005178,
"acc_stderr": 0.004970759774676881,
"acc_norm": 0.7209719179446326,
"acc_norm_stderr": 0.004476047101806569
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.03894734487013317,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.03894734487013317
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5660377358490566,
"acc_stderr": 0.03050329201334259,
"acc_norm": 0.5660377358490566,
"acc_norm_stderr": 0.03050329201334259
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.45,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6468085106382979,
"acc_stderr": 0.031245325202761926,
"acc_norm": 0.6468085106382979,
"acc_norm_stderr": 0.031245325202761926
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.04013124195424386,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.04013124195424386
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5687830687830688,
"acc_stderr": 0.0255064816981382,
"acc_norm": 0.5687830687830688,
"acc_norm_stderr": 0.0255064816981382
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5317460317460317,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.5317460317460317,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6387096774193548,
"acc_stderr": 0.02732754844795754,
"acc_norm": 0.6387096774193548,
"acc_norm_stderr": 0.02732754844795754
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03588624800091707,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03588624800091707
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7070707070707071,
"acc_stderr": 0.03242497958178815,
"acc_norm": 0.7070707070707071,
"acc_norm_stderr": 0.03242497958178815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6839378238341969,
"acc_stderr": 0.03355397369686172,
"acc_norm": 0.6839378238341969,
"acc_norm_stderr": 0.03355397369686172
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5846153846153846,
"acc_stderr": 0.024985354923102335,
"acc_norm": 0.5846153846153846,
"acc_norm_stderr": 0.024985354923102335
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4111111111111111,
"acc_stderr": 0.02999992350870668,
"acc_norm": 0.4111111111111111,
"acc_norm_stderr": 0.02999992350870668
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7100840336134454,
"acc_stderr": 0.029472485833136094,
"acc_norm": 0.7100840336134454,
"acc_norm_stderr": 0.029472485833136094
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.41721854304635764,
"acc_stderr": 0.04026141497634611,
"acc_norm": 0.41721854304635764,
"acc_norm_stderr": 0.04026141497634611
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.744954128440367,
"acc_stderr": 0.018688500856535825,
"acc_norm": 0.744954128440367,
"acc_norm_stderr": 0.018688500856535825
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.034849415144292316,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.034849415144292316
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6708860759493671,
"acc_stderr": 0.030587326294702358,
"acc_norm": 0.6708860759493671,
"acc_norm_stderr": 0.030587326294702358
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5560538116591929,
"acc_stderr": 0.03334625674242728,
"acc_norm": 0.5560538116591929,
"acc_norm_stderr": 0.03334625674242728
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009224,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009224
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6776859504132231,
"acc_stderr": 0.04266416363352168,
"acc_norm": 0.6776859504132231,
"acc_norm_stderr": 0.04266416363352168
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.04691521224077742,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.04691521224077742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.025140935950335435,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.025140935950335435
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6819923371647509,
"acc_stderr": 0.016653486275615387,
"acc_norm": 0.6819923371647509,
"acc_norm_stderr": 0.016653486275615387
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.02658923114217426,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.02658923114217426
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31731843575418994,
"acc_stderr": 0.015566392630057027,
"acc_norm": 0.31731843575418994,
"acc_norm_stderr": 0.015566392630057027
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.02849199358617156,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.02849199358617156
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5916398713826366,
"acc_stderr": 0.027917050748484627,
"acc_norm": 0.5916398713826366,
"acc_norm_stderr": 0.027917050748484627
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5432098765432098,
"acc_stderr": 0.027716661650194038,
"acc_norm": 0.5432098765432098,
"acc_norm_stderr": 0.027716661650194038
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.39361702127659576,
"acc_stderr": 0.029144544781596157,
"acc_norm": 0.39361702127659576,
"acc_norm_stderr": 0.029144544781596157
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3852672750977836,
"acc_stderr": 0.01242948543495521,
"acc_norm": 0.3852672750977836,
"acc_norm_stderr": 0.01242948543495521
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.39338235294117646,
"acc_stderr": 0.02967428828131118,
"acc_norm": 0.39338235294117646,
"acc_norm_stderr": 0.02967428828131118
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.020219083895133924,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.020219083895133924
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6612244897959184,
"acc_stderr": 0.03029950656215418,
"acc_norm": 0.6612244897959184,
"acc_norm_stderr": 0.03029950656215418
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.031524391865554016,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.031524391865554016
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.038284011150790206,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.038284011150790206
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5847953216374269,
"acc_stderr": 0.03779275945503201,
"acc_norm": 0.5847953216374269,
"acc_norm_stderr": 0.03779275945503201
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29253365973072215,
"mc1_stderr": 0.015925597445286165,
"mc2": 0.4051174166535907,
"mc2_stderr": 0.015044532390937759
},
"harness|winogrande|5": {
"acc": 0.6708760852407262,
"acc_stderr": 0.01320638708909147
},
"harness|gsm8k|5": {
"acc": 0.2486732373009856,
"acc_stderr": 0.011906147222879979
}
}
```
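Leaderboard-style aggregates can be reproduced from a results payload like the one above by averaging `acc` over the `hendrycksTest-*` entries. A minimal sketch — the two-task dict below is a truncated stand-in for the full payload, not the complete results:

```python
# Truncated stand-in for the full results payload shown above
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.37777777777777777},
}

# Select the MMLU (hendrycksTest) tasks and macro-average their accuracy
mmlu_tasks = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
mmlu_acc = sum(v["acc"] for v in mmlu_tasks.values()) / len(mmlu_tasks)
print(f"MMLU macro-average over {len(mmlu_tasks)} tasks: {mmlu_acc:.4f}")
```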
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Falah/pixarstyle_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 45415433
num_examples: 100000
download_size: 5581919
dataset_size: 45415433
---
# Dataset Card for "pixarstyle_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
firstgradeai/ytrends5 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 13395866.378974993
num_examples: 9069
- name: test
num_bytes: 5741507.621025008
num_examples: 3887
download_size: 9624751
dataset_size: 19137374.0
---
# Dataset Card for "ytrends5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DZN222/teste21 | ---
license: openrail
---
|
louisbrulenaudet/lpf | ---
license: apache-2.0
language:
- fr
multilinguality:
- monolingual
tags:
- finetuning
- legal
- tax
- llm
- fiscal
- lpf
- Livre des procédures fiscales
source_datasets:
- original
pretty_name: Livre des procédures fiscales (LPF)
task_categories:
- text-generation
- table-question-answering
- summarization
- conversational
size_categories:
- n<1K
---
# Livre des procédures fiscales, non-instruct (11-12-2023)
This project focuses on fine-tuning pre-trained language models to create efficient and accurate models for tax practice.
Fine-tuning is the process of adapting a pre-trained model to perform specific tasks or cater to particular domains. It involves adjusting the model's parameters through a further round of training on task-specific or domain-specific data. While conventional fine-tuning strategies involve supervised learning with labeled data, instruction-based fine-tuning introduces a more structured and interpretable approach.
Instruction-based fine-tuning leverages the power of human-provided instructions to guide the model's behavior. These instructions can be in the form of text prompts, prompts with explicit task descriptions, or a combination of both. This approach allows for a more controlled and context-aware interaction with the LLM, making it adaptable to a multitude of specialized tasks.
Instruction-based fine-tuning significantly enhances the performance of LLMs in the following ways:
- Task-Specific Adaptation: LLMs, when fine-tuned with specific instructions, exhibit remarkable adaptability to diverse tasks. They can switch seamlessly between translation, summarization, and question-answering, guided by the provided instructions.
- Reduced Ambiguity: Traditional LLMs might generate ambiguous or contextually inappropriate responses. Instruction-based fine-tuning allows for a clearer and more context-aware generation, reducing the likelihood of nonsensical outputs.
- Efficient Knowledge Transfer: Instructions can encapsulate domain-specific knowledge, enabling LLMs to benefit from expert guidance. This knowledge transfer is particularly valuable in fields like tax practice, law, medicine, and more.
- Interpretability: Instruction-based fine-tuning also makes LLM behavior more interpretable. Since the instructions are human-readable, it becomes easier to understand and control model outputs.
- Adaptive Behavior: LLMs, post instruction-based fine-tuning, exhibit adaptive behavior that is responsive to both explicit task descriptions and implicit cues within the provided text.
## Dataset generation
This JSON file is a list of dictionaries, each of which contains the following fields:
- `instruction`: `string`, presenting the instruction linked to the element.
- `input`: `string`, signifying the input details for the element.
- `output`: `string`, indicating the output information for the element.
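Each record can therefore be built by pairing a randomly drawn instruction with an article's reference and text. A minimal sketch of that record shape, assuming this generation scheme — the article reference and body below are placeholders, not real LPF content:

```python
import json
import random

# Two instructions from the list used for generation (see below)
instructions = [
    "Compose l'intégralité de l'article sous forme écrite.",
    "Écris la totalité du contenu de l'article.",
]

def build_record(article_ref: str, article_text: str) -> dict:
    """Pair a random instruction with an article to form one dataset record."""
    return {
        "instruction": random.choice(instructions),
        "input": article_ref,
        "output": article_text,
    }

record = build_record("Article L10", "Texte de l'article (placeholder).")
print(json.dumps(record, ensure_ascii=False, indent=2))
```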
We used the following list of instructions for generating the dataset:
```python
instructions = [
"Compose l'intégralité de l'article sous forme écrite.",
"Écris la totalité du contenu de l'article.",
"Formule la totalité du texte présent dans l'article.",
"Produis l'intégralité de l'article en écriture.",
"Développe l'article dans son ensemble par écrit.",
"Génère l'ensemble du texte contenu dans l'article.",
"Formule le contenu intégral de l'article en entier.",
"Rédige la totalité du texte de l'article en entier.",
"Compose l'intégralité du contenu textuel de l'article.",
"Rédige l'ensemble du texte qui constitue l'article.",
"Formule l'article entier dans son contenu écrit.",
"Composez l'intégralité de l'article sous forme écrite.",
"Écrivez la totalité du contenu de l'article.",
"Formulez la totalité du texte présent dans l'article.",
"Développez l'article dans son ensemble par écrit.",
"Générez l'ensemble du texte contenu dans l'article.",
"Formulez le contenu intégral de l'article en entier.",
"Rédigez la totalité du texte de l'article en entier.",
"Composez l'intégralité du contenu textuel de l'article.",
"Écrivez l'article dans son intégralité en termes de texte.",
"Rédigez l'ensemble du texte qui constitue l'article.",
"Formulez l'article entier dans son contenu écrit.",
"Composer l'intégralité de l'article sous forme écrite.",
"Écrire la totalité du contenu de l'article.",
"Formuler la totalité du texte présent dans l'article.",
"Produire l'intégralité de l'article en écriture.",
"Développer l'article dans son ensemble par écrit.",
"Générer l'ensemble du texte contenu dans l'article.",
"Formuler le contenu intégral de l'article en entier.",
"Rédiger la totalité du texte de l'article en entier.",
"Composer l'intégralité du contenu textuel de l'article.",
"Rédiger l'ensemble du texte qui constitue l'article.",
"Formuler l'article entier dans son contenu écrit.",
"Quelles sont les dispositions de l'article ?",
"Quelles dispositions sont incluses dans l'article ?",
"Quelles sont les dispositions énoncées dans l'article ?",
"Quel est le texte intégral de l'article ?",
"Quelle est la lettre de l'article ?"
]
```
## Citing this project
If you use this code in your research, please use the following BibTeX entry.
```BibTeX
@misc{louisbrulenaudet2023,
author = {Louis Brulé Naudet},
title = {Livre des procédures fiscales, non-instruct (11-12-2023)},
howpublished = {\url{https://huggingface.co/datasets/louisbrulenaudet/lpf}},
year = {2023}
}
```
## Feedback
If you have any feedback, please reach out at [louisbrulenaudet@icloud.com](mailto:louisbrulenaudet@icloud.com). |
ibizagrowthagency/train | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Aquarell Tattoos
'1': Bedeutung der Tribal Tattoos
'2': Blackwork Tattoo
'3': Building
'4': Cover-Up Tattoo
'5': Dotwork Tattoos
'6': Fineline Tattoos
'7': Geschiche der Maori Tattoos
'8': Japanische Tattoos in Leipzig
'9': Narben Tattoo
'10': Portrait Tattoos
'11': Poster
'12': Realistic Tattoos
'13': Totenkopf Tattoos
'14': Trashpolka Tattoos
'15': Tribal Tattoo
'16': Wikinger Tattoos
splits:
- name: train
num_bytes: 6665820.160194174
num_examples: 175
- name: test
num_bytes: 1297030.8398058251
num_examples: 31
download_size: 7953806
dataset_size: 7962851.0
---
# Dataset Card for "train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ksei/trial_data | ---
dataset_info:
features:
- name: path
dtype: string
- name: audio
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 60706713.0
num_examples: 439
download_size: 60524618
dataset_size: 60706713.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tak15/nva-decapitation230629 | ---
tags:
- not-for-all-audiences
--- |
Fcabb/coringaak | ---
license: openrail
---
|
ZhongshengWang/Alpaca-cnn-dailymail | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- summarization
- text-generation
task_ids: []
paperswithcode_id: cnn-daily-mail-1
pretty_name: CNN / Daily Mail
tags:
- conditional-text-generation
---
## Data Summary
The Alpaca-cnn-dailymail dataset is a reformatted version of [ccdv/cnn_dailymail](https://huggingface.co/datasets/ccdv/cnn_dailymail), converted to the Alpaca format for fine-tuning Llama2. Only versions 3.0.0 and 2.0.0 were merged and used as the core dataset for the summarization task.
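A minimal sketch of how a cnn_dailymail record could be mapped into the Alpaca instruction/input/output format is shown below. The source field names (`article`, `highlights`) follow the original cnn_dailymail schema; the instruction string itself is illustrative, not necessarily the one used to build this dataset:

```python
# Hypothetical mapping from a cnn_dailymail record to an Alpaca-style record.
def to_alpaca(record: dict) -> dict:
    return {
        "instruction": "Summarize the following news article.",
        "input": record["article"],      # full article text
        "output": record["highlights"],  # reference summary
    }

sample = {"article": "Some news text ...", "highlights": "A short summary."}
alpaca_record = to_alpaca(sample)
```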
## Licensing Information
The Alpaca-cnn-dailymail dataset version 1.0.0 is released under the Apache-2.0 License.
## Citation Information
```
@inproceedings{see-etal-2017-get,
title = "Get To The Point: Summarization with Pointer-Generator Networks",
author = "See, Abigail and
Liu, Peter J. and
Manning, Christopher D.",
booktitle = "Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
month = jul,
year = "2017",
address = "Vancouver, Canada",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/P17-1099",
doi = "10.18653/v1/P17-1099",
pages = "1073--1083",
abstract = "Neural sequence-to-sequence models have provided a viable new approach for abstractive text summarization (meaning they are not restricted to simply selecting and rearranging passages from the original text). However, these models have two shortcomings: they are liable to reproduce factual details inaccurately, and they tend to repeat themselves. In this work we propose a novel architecture that augments the standard sequence-to-sequence attentional model in two orthogonal ways. First, we use a hybrid pointer-generator network that can copy words from the source text via pointing, which aids accurate reproduction of information, while retaining the ability to produce novel words through the generator. Second, we use coverage to keep track of what has been summarized, which discourages repetition. We apply our model to the CNN / Daily Mail summarization task, outperforming the current abstractive state-of-the-art by at least 2 ROUGE points.",
}
```
```
@inproceedings{DBLP:conf/nips/HermannKGEKSB15,
author={Karl Moritz Hermann and Tomás Kociský and Edward Grefenstette and Lasse Espeholt and Will Kay and Mustafa Suleyman and Phil Blunsom},
title={Teaching Machines to Read and Comprehend},
year={2015},
cdate={1420070400000},
pages={1693-1701},
url={http://papers.nips.cc/paper/5945-teaching-machines-to-read-and-comprehend},
booktitle={NIPS},
crossref={conf/nips/2015}
}
```
|
streamerbtw1002/physics-50KB | ---
license: apache-2.0
language:
- en
size_categories:
- 10K<n<100K
--- |
kye/all-lucidrain-code-python-tokenized-8192-4 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 170959464
num_examples: 4173
download_size: 39435682
dataset_size: 170959464
---
# Dataset Card for "all-lucidrain-code-python-tokenized-8192-4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
madhaviit/corybooker_comments | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: post_id
dtype: string
- name: comment_id
dtype: int64
- name: comment_url
dtype: string
- name: commenter_id
dtype: int64
- name: commenter_name
dtype: string
- name: comment_text
dtype: string
- name: comment_time
dtype: string
- name: comment_image
dtype: string
- name: comment_reactors
dtype: string
- name: spam
dtype: string
- name: hate
dtype: string
splits:
- name: train
num_bytes: 2520157
num_examples: 5300
download_size: 896481
dataset_size: 2520157
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
brianhoel/fgsadfg | ---
pretty_name: dsgfsdfg
--- |
mask-distilled-one-sec-cv12/chunk_161 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1150710528
num_examples: 225984
download_size: 1173677880
dataset_size: 1150710528
---
# Dataset Card for "chunk_161"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vibhorag101/phr_mental_therapy_dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 458762343
num_examples: 99086
download_size: 211247054
dataset_size: 458762343
license: mit
task_categories:
- text-generation
language:
- en
tags:
- medical
pretty_name: Synthetic Mental Therapy Dataset
size_categories:
- 10K<n<100K
---
# Dataset Card for "phr_mental_health_dataset"
- This dataset is a cleaned version of [nart-100k-synthetic](https://huggingface.co/datasets/jerryjalapeno/nart-100k-synthetic)
- The data is generated synthetically using gpt3.5-turbo using [this](https://github.com/jerryjalapeno/nart-100k-7b/blob/main/synthetic_conv_gen.py) script.
- The original data was in a "sharegpt"-style JSONL format, with each JSON object holding "human" and "gpt" keys in equal numbers of turns.
- The data was then cleaned, and the following changes were made
- The names "Alex" and "Charlie" were removed from the dataset, which can often come up in the conversation of fine-tuned models.
- The data was then converted to the format required for llama-2-chat models.
- The dataset was converted to JSONL format with just a single key, "text", which contains the combined text for training the model.
- The appropriate llama-2 system prompt was added at the beginning of the conversation.
  - The conversation was then enclosed with the [INST], [/INST], `<s>`, and `</s>` tokens as defined in the [llama-2](https://huggingface.co/blog/llama2#:~:text=Using%20text-generation-inference%20and%20Inference%20Endpoints&text=You%20can%20try%20out%20Text,Deploy%20-%3E%20Inference%20Endpoints%20widget.) article.
  - Whether or not to include the last turn of the conversation, i.e., the final GPT response, was chosen randomly.
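The conversion steps above can be sketched as follows. This is a simplified illustration of the llama-2 chat wrapping, assuming a placeholder system prompt; the actual system prompt and formatting details used for this dataset may differ:

```python
# Hypothetical system prompt; the dataset's actual prompt may differ.
SYSTEM_PROMPT = "You are a helpful and empathetic mental-health counsellor."

def to_llama2(turns: list[tuple[str, str]]) -> str:
    """Wrap (human, gpt) message pairs in the llama-2 chat format."""
    text = ""
    for i, (human, gpt) in enumerate(turns):
        if i == 0:
            # The system prompt is injected into the first user turn.
            prompt = f"<<SYS>>\n{SYSTEM_PROMPT}\n<</SYS>>\n\n{human}"
        else:
            prompt = human
        text += f"<s>[INST] {prompt} [/INST] {gpt} </s>"
    return text

example = to_llama2([("I feel anxious lately.", "I'm sorry to hear that ...")])
```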
|
HanxuHU/Multi_MMMU | ---
dataset_info:
config_name: Accounting
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
struct:
- name: bytes
dtype: binary
- name: path
dtype: string
- name: image_2
dtype: 'null'
- name: image_3
dtype: 'null'
- name: image_4
dtype: 'null'
- name: image_5
dtype: 'null'
- name: image_6
dtype: 'null'
- name: image_7
dtype: 'null'
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1598548
num_examples: 30
download_size: 1533719
dataset_size: 1598548
configs:
- config_name: Accounting
data_files:
- split: validation
path: Accounting/validation-*
---
|
lvdthieu/solfile-v2 | ---
license: mit
---
|
CyberHarem/kirov_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kirov/キーロフ/基洛夫 (Azur Lane)
This is the dataset of kirov/キーロフ/基洛夫 (Azur Lane), containing 33 images and their tags.
The core tags of this character are `long_hair, breasts, very_long_hair, white_hair, yellow_eyes, large_breasts, ponytail, bangs, grey_hair, huge_breasts, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 33 | 55.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kirov_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 33 | 27.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kirov_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 71 | 51.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kirov_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 33 | 45.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kirov_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 71 | 79.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kirov_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kirov_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, looking_at_viewer, solo, bare_shoulders, black_shorts, hair_ornament, holding_cup, smile, barefoot, braid, long_sleeves, off_shoulder, official_alternate_costume, short_shorts, sideboob, white_shirt, black_ribbon, closed_mouth, hair_ribbon, open_clothes, ankle_ribbon, from_side, full_body, indoors, plant, standing, swept_bangs |
| 1 | 8 |  |  |  |  |  | 1girl, solo, cleavage, fur_trim, looking_at_viewer, white_headwear, pantyhose, smile, black_gloves, military_hat, black_necktie, blue_skirt, cape, pleated_skirt, simple_background, white_background, holding_weapon, standing, sword |
| 2 | 12 |  |  |  |  |  | 1girl, blush, sweat, navel, solo, completely_nude, nipples, open_mouth, thighs, heart, hetero, sex |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | bare_shoulders | black_shorts | hair_ornament | holding_cup | smile | barefoot | braid | long_sleeves | off_shoulder | official_alternate_costume | short_shorts | sideboob | white_shirt | black_ribbon | closed_mouth | hair_ribbon | open_clothes | ankle_ribbon | from_side | full_body | indoors | plant | standing | swept_bangs | cleavage | fur_trim | white_headwear | pantyhose | black_gloves | military_hat | black_necktie | blue_skirt | cape | pleated_skirt | simple_background | white_background | holding_weapon | sword | blush | sweat | navel | completely_nude | nipples | open_mouth | thighs | heart | hetero | sex |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-----------------|:---------------|:----------------|:--------------|:--------|:-----------|:--------|:---------------|:---------------|:-----------------------------|:---------------|:-----------|:--------------|:---------------|:---------------|:--------------|:---------------|:---------------|:------------|:------------|:----------|:--------|:-----------|:--------------|:-----------|:-----------|:-----------------|:------------|:---------------|:---------------|:----------------|:-------------|:-------|:----------------|:--------------------|:-------------------|:-----------------|:--------|:--------|:--------|:--------|:------------------|:----------|:-------------|:---------|:--------|:---------|:------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | | | | | X | | | | | | | | | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 2 | 12 |  |  |  |  |  | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
|
ChanceFocus/flare-finarg-ecc-auc | ---
dataset_info:
features:
- name: id
dtype: string
- name: query
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
- name: choices
sequence: string
- name: gold
dtype: int64
splits:
- name: test
num_bytes: 549300
num_examples: 969
download_size: 177802
dataset_size: 549300
---
# Dataset Card for "flare-finarg-ecc-auc"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/leona_leagueoflegends | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of leona (League of Legends)
This is the dataset of leona (League of Legends), containing 157 images and their tags.
The core tags of this character are `long_hair, breasts, brown_hair, large_breasts, brown_eyes, lips`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 157 | 149.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leona_leagueoflegends/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 157 | 100.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leona_leagueoflegends/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 301 | 182.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leona_leagueoflegends/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 157 | 136.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leona_leagueoflegends/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 301 | 235.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leona_leagueoflegends/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/leona_leagueoflegends',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, solo, sword, shield, ear_protection, armored_dress, breastplate, gauntlets, holding |
| 1 | 16 |  |  |  |  |  | 1girl, hetero, penis, solo_focus, 1boy, sex, nude, uncensored, nipples, open_mouth, vaginal, cum_in_pussy, blush, navel |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | sword | shield | ear_protection | armored_dress | breastplate | gauntlets | holding | hetero | penis | solo_focus | 1boy | sex | nude | uncensored | nipples | open_mouth | vaginal | cum_in_pussy | blush | navel |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:---------|:-----------------|:----------------|:--------------|:------------|:----------|:---------|:--------|:-------------|:-------|:------|:-------|:-------------|:----------|:-------------|:----------|:---------------|:--------|:--------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 1 | 16 |  |  |  |  |  | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-06T15:59:49.029647](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down_public/blob/main/results_2023-11-06T15-59-49.029647.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3382969798657718,\n\
\ \"em_stderr\": 0.004845295517321938,\n \"f1\": 0.377463296979866,\n\
\ \"f1_stderr\": 0.004772531415054459,\n \"acc\": 0.44698214966373917,\n\
\ \"acc_stderr\": 0.010405035391715039\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.3382969798657718,\n \"em_stderr\": 0.004845295517321938,\n\
\ \"f1\": 0.377463296979866,\n \"f1_stderr\": 0.004772531415054459\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12206216830932524,\n \
\ \"acc_stderr\": 0.009017054965766493\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7719021310181531,\n \"acc_stderr\": 0.011793015817663583\n\
\ }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_11_02T21_44_42.367219
path:
- '**/details_harness|drop|3_2023-11-02T21-44-42.367219.parquet'
- split: 2023_11_05T00_12_34.363796
path:
- '**/details_harness|drop|3_2023-11-05T00-12-34.363796.parquet'
- split: 2023_11_06T15_59_49.029647
path:
- '**/details_harness|drop|3_2023-11-06T15-59-49.029647.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-06T15-59-49.029647.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_02T21_44_42.367219
path:
- '**/details_harness|gsm8k|5_2023-11-02T21-44-42.367219.parquet'
- split: 2023_11_05T00_12_34.363796
path:
- '**/details_harness|gsm8k|5_2023-11-05T00-12-34.363796.parquet'
- split: 2023_11_06T15_59_49.029647
path:
- '**/details_harness|gsm8k|5_2023-11-06T15-59-49.029647.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-06T15-59-49.029647.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_02T21_44_42.367219
path:
- '**/details_harness|winogrande|5_2023-11-02T21-44-42.367219.parquet'
- split: 2023_11_05T00_12_34.363796
path:
- '**/details_harness|winogrande|5_2023-11-05T00-12-34.363796.parquet'
- split: 2023_11_06T15_59_49.029647
path:
- '**/details_harness|winogrande|5_2023-11-06T15-59-49.029647.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-06T15-59-49.029647.parquet'
- config_name: results
data_files:
- split: 2023_11_02T21_44_42.367219
path:
- results_2023-11-02T21-44-42.367219.parquet
- split: 2023_11_05T00_12_34.363796
path:
- results_2023-11-05T00-12-34.363796.parquet
- split: 2023_11_06T15_59_49.029647
path:
- results_2023-11-06T15-59-49.029647.parquet
- split: latest
path:
- results_2023-11-06T15-59-49.029647.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down_public",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-11-06T15:59:49.029647](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down_public/blob/main/results_2023-11-06T15-59-49.029647.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.3382969798657718,
"em_stderr": 0.004845295517321938,
"f1": 0.377463296979866,
"f1_stderr": 0.004772531415054459,
"acc": 0.44698214966373917,
"acc_stderr": 0.010405035391715039
},
"harness|drop|3": {
"em": 0.3382969798657718,
"em_stderr": 0.004845295517321938,
"f1": 0.377463296979866,
"f1_stderr": 0.004772531415054459
},
"harness|gsm8k|5": {
"acc": 0.12206216830932524,
"acc_stderr": 0.009017054965766493
},
"harness|winogrande|5": {
"acc": 0.7719021310181531,
"acc_stderr": 0.011793015817663583
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mask-distilled-libri-one-sec-cv12/chunk_8 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: logits
sequence: float32
splits:
- name: train
num_bytes: 234448381.12916
num_examples: 7313
download_size: 180648241
dataset_size: 234448381.12916
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-staging-eval-project-kmfoda__booksum-e703e34d-10975474 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- kmfoda/booksum
eval_info:
task: summarization
model: pszemraj/long-t5-tglobal-base-16384-book-summary
metrics: []
dataset_name: kmfoda/booksum
dataset_config: kmfoda--booksum
dataset_split: test
col_mapping:
text: chapter
target: summary_text
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/long-t5-tglobal-base-16384-book-summary
* Dataset: kmfoda/booksum
* Config: kmfoda--booksum
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model. |
Kamaljp/amazon_us_3000 | ---
dataset_info:
features:
- name: marketplace
dtype: string
- name: customer_id
dtype: string
- name: review_id
dtype: string
- name: product_id
dtype: string
- name: product_parent
dtype: string
- name: product_title
dtype: string
- name: product_category
dtype: string
- name: star_rating
dtype: int32
- name: helpful_votes
dtype: int32
- name: total_votes
dtype: int32
- name: vine
dtype:
class_label:
names:
'0': N
'1': Y
- name: verified_purchase
dtype:
class_label:
names:
'0': N
'1': Y
- name: review_headline
dtype: string
- name: review_body
dtype: string
- name: review_date
dtype: string
splits:
- name: train
num_bytes: 1391025
num_examples: 3000
download_size: 763643
dataset_size: 1391025
---
# Dataset Card for "amazon_us_3000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JovialValley/syllable_totalMapped3 | ---
dataset_info:
features:
- name: input_values
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 108599016
num_examples: 390
- name: test
num_bytes: 26977548
num_examples: 97
download_size: 136574643
dataset_size: 135576564
---
# Dataset Card for "syllable_totalMapped3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Chickaboo__ChickaQ | ---
pretty_name: Evaluation run of Chickaboo/ChickaQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Chickaboo/ChickaQ](https://huggingface.co/Chickaboo/ChickaQ) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Chickaboo__ChickaQ\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-21T14:10:54.618600](https://huggingface.co/datasets/open-llm-leaderboard/details_Chickaboo__ChickaQ/blob/main/results_2024-03-21T14-10-54.618600.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3659848060809438,\n\
\ \"acc_stderr\": 0.03373302007951669,\n \"acc_norm\": 0.37124839114399955,\n\
\ \"acc_norm_stderr\": 0.03461636251212984,\n \"mc1\": 0.22643818849449204,\n\
\ \"mc1_stderr\": 0.014651337324602588,\n \"mc2\": 0.47219104025186426,\n\
\ \"mc2_stderr\": 0.016351942852493542\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.25341296928327645,\n \"acc_stderr\": 0.012710896778378606,\n\
\ \"acc_norm\": 0.29436860068259385,\n \"acc_norm_stderr\": 0.013318528460539426\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3866759609639514,\n\
\ \"acc_stderr\": 0.004859930926500309,\n \"acc_norm\": 0.49153555068711413,\n\
\ \"acc_norm_stderr\": 0.004989066355449555\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3851851851851852,\n\
\ \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.3851851851851852,\n\
\ \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4407894736842105,\n \"acc_stderr\": 0.04040311062490436,\n\
\ \"acc_norm\": 0.4407894736842105,\n \"acc_norm_stderr\": 0.04040311062490436\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.36981132075471695,\n \"acc_stderr\": 0.02971142188010793,\n\
\ \"acc_norm\": 0.36981132075471695,\n \"acc_norm_stderr\": 0.02971142188010793\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3263888888888889,\n\
\ \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.3263888888888889,\n\
\ \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.34104046242774566,\n\
\ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.34104046242774566,\n\
\ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2851063829787234,\n \"acc_stderr\": 0.02951319662553935,\n\
\ \"acc_norm\": 0.2851063829787234,\n \"acc_norm_stderr\": 0.02951319662553935\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2724867724867725,\n \"acc_stderr\": 0.022930973071633345,\n \"\
acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.022930973071633345\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15873015873015872,\n\
\ \"acc_stderr\": 0.03268454013011743,\n \"acc_norm\": 0.15873015873015872,\n\
\ \"acc_norm_stderr\": 0.03268454013011743\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3709677419354839,\n\
\ \"acc_stderr\": 0.02748054188795359,\n \"acc_norm\": 0.3709677419354839,\n\
\ \"acc_norm_stderr\": 0.02748054188795359\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.33004926108374383,\n \"acc_stderr\": 0.03308530426228257,\n\
\ \"acc_norm\": 0.33004926108374383,\n \"acc_norm_stderr\": 0.03308530426228257\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.03895658065271846,\n\
\ \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.03895658065271846\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5202020202020202,\n \"acc_stderr\": 0.035594435655639176,\n \"\
acc_norm\": 0.5202020202020202,\n \"acc_norm_stderr\": 0.035594435655639176\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.47150259067357514,\n \"acc_stderr\": 0.036025735712884414,\n\
\ \"acc_norm\": 0.47150259067357514,\n \"acc_norm_stderr\": 0.036025735712884414\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3,\n \"acc_stderr\": 0.023234581088428494,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.023234581088428494\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073835,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073835\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.28991596638655465,\n \"acc_stderr\": 0.029472485833136077,\n\
\ \"acc_norm\": 0.28991596638655465,\n \"acc_norm_stderr\": 0.029472485833136077\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360383,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360383\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.41467889908256883,\n \"acc_stderr\": 0.021122903208602602,\n \"\
acc_norm\": 0.41467889908256883,\n \"acc_norm_stderr\": 0.021122903208602602\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.27314814814814814,\n \"acc_stderr\": 0.03038805130167812,\n \"\
acc_norm\": 0.27314814814814814,\n \"acc_norm_stderr\": 0.03038805130167812\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4166666666666667,\n \"acc_stderr\": 0.034602283272391704,\n \"\
acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.034602283272391704\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4388185654008439,\n \"acc_stderr\": 0.032302649315470375,\n \
\ \"acc_norm\": 0.4388185654008439,\n \"acc_norm_stderr\": 0.032302649315470375\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3632286995515695,\n\
\ \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.3632286995515695,\n\
\ \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3893129770992366,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.3893129770992366,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5537190082644629,\n \"acc_stderr\": 0.0453793517794788,\n \"acc_norm\"\
: 0.5537190082644629,\n \"acc_norm_stderr\": 0.0453793517794788\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.42592592592592593,\n\
\ \"acc_stderr\": 0.0478034362693679,\n \"acc_norm\": 0.42592592592592593,\n\
\ \"acc_norm_stderr\": 0.0478034362693679\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3558282208588957,\n \"acc_stderr\": 0.03761521380046735,\n\
\ \"acc_norm\": 0.3558282208588957,\n \"acc_norm_stderr\": 0.03761521380046735\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.043270409325787296,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.043270409325787296\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.49514563106796117,\n \"acc_stderr\": 0.04950504382128919,\n\
\ \"acc_norm\": 0.49514563106796117,\n \"acc_norm_stderr\": 0.04950504382128919\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5726495726495726,\n\
\ \"acc_stderr\": 0.03240847393516327,\n \"acc_norm\": 0.5726495726495726,\n\
\ \"acc_norm_stderr\": 0.03240847393516327\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.44699872286079184,\n\
\ \"acc_stderr\": 0.017779225233394216,\n \"acc_norm\": 0.44699872286079184,\n\
\ \"acc_norm_stderr\": 0.017779225233394216\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3872832369942196,\n \"acc_stderr\": 0.02622615860512465,\n\
\ \"acc_norm\": 0.3872832369942196,\n \"acc_norm_stderr\": 0.02622615860512465\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n\
\ \"acc_stderr\": 0.014310999547961455,\n \"acc_norm\": 0.24134078212290502,\n\
\ \"acc_norm_stderr\": 0.014310999547961455\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.028358956313423545,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.028358956313423545\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.34726688102893893,\n\
\ \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.34726688102893893,\n\
\ \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.027125115513166858,\n\
\ \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.027125115513166858\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.29432624113475175,\n \"acc_stderr\": 0.027187127011503793,\n \
\ \"acc_norm\": 0.29432624113475175,\n \"acc_norm_stderr\": 0.027187127011503793\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.30834419817470665,\n\
\ \"acc_stderr\": 0.011794833789715327,\n \"acc_norm\": 0.30834419817470665,\n\
\ \"acc_norm_stderr\": 0.011794833789715327\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.2977941176470588,\n \"acc_stderr\": 0.02777829870154544,\n\
\ \"acc_norm\": 0.2977941176470588,\n \"acc_norm_stderr\": 0.02777829870154544\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3627450980392157,\n \"acc_stderr\": 0.019450768432505514,\n \
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.019450768432505514\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.40816326530612246,\n \"acc_stderr\": 0.03146465712827424,\n\
\ \"acc_norm\": 0.40816326530612246,\n \"acc_norm_stderr\": 0.03146465712827424\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.582089552238806,\n\
\ \"acc_stderr\": 0.034875586404620636,\n \"acc_norm\": 0.582089552238806,\n\
\ \"acc_norm_stderr\": 0.034875586404620636\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3253012048192771,\n\
\ \"acc_stderr\": 0.03647168523683227,\n \"acc_norm\": 0.3253012048192771,\n\
\ \"acc_norm_stderr\": 0.03647168523683227\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3684210526315789,\n \"acc_stderr\": 0.036996580176568775,\n\
\ \"acc_norm\": 0.3684210526315789,\n \"acc_norm_stderr\": 0.036996580176568775\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22643818849449204,\n\
\ \"mc1_stderr\": 0.014651337324602588,\n \"mc2\": 0.47219104025186426,\n\
\ \"mc2_stderr\": 0.016351942852493542\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5611681136543015,\n \"acc_stderr\": 0.013946933444507032\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.006065200909780136,\n \
\ \"acc_stderr\": 0.0021386703014604795\n }\n}\n```"
repo_url: https://huggingface.co/Chickaboo/ChickaQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|arc:challenge|25_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|gsm8k|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hellaswag|10_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T14-10-54.618600.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T14-10-54.618600.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- '**/details_harness|winogrande|5_2024-03-21T14-10-54.618600.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-21T14-10-54.618600.parquet'
- config_name: results
data_files:
- split: 2024_03_21T14_10_54.618600
path:
- results_2024-03-21T14-10-54.618600.parquet
- split: latest
path:
- results_2024-03-21T14-10-54.618600.parquet
---
# Dataset Card for Evaluation run of Chickaboo/ChickaQ
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Chickaboo/ChickaQ](https://huggingface.co/Chickaboo/ChickaQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Chickaboo__ChickaQ",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-21T14:10:54.618600](https://huggingface.co/datasets/open-llm-leaderboard/details_Chickaboo__ChickaQ/blob/main/results_2024-03-21T14-10-54.618600.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in its config, with a "latest" split for each eval):
```python
{
"all": {
"acc": 0.3659848060809438,
"acc_stderr": 0.03373302007951669,
"acc_norm": 0.37124839114399955,
"acc_norm_stderr": 0.03461636251212984,
"mc1": 0.22643818849449204,
"mc1_stderr": 0.014651337324602588,
"mc2": 0.47219104025186426,
"mc2_stderr": 0.016351942852493542
},
"harness|arc:challenge|25": {
"acc": 0.25341296928327645,
"acc_stderr": 0.012710896778378606,
"acc_norm": 0.29436860068259385,
"acc_norm_stderr": 0.013318528460539426
},
"harness|hellaswag|10": {
"acc": 0.3866759609639514,
"acc_stderr": 0.004859930926500309,
"acc_norm": 0.49153555068711413,
"acc_norm_stderr": 0.004989066355449555
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4407894736842105,
"acc_stderr": 0.04040311062490436,
"acc_norm": 0.4407894736842105,
"acc_norm_stderr": 0.04040311062490436
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.36981132075471695,
"acc_stderr": 0.02971142188010793,
"acc_norm": 0.36981132075471695,
"acc_norm_stderr": 0.02971142188010793
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3263888888888889,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.3263888888888889,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.34104046242774566,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.34104046242774566,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2851063829787234,
"acc_stderr": 0.02951319662553935,
"acc_norm": 0.2851063829787234,
"acc_norm_stderr": 0.02951319662553935
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2724867724867725,
"acc_stderr": 0.022930973071633345,
"acc_norm": 0.2724867724867725,
"acc_norm_stderr": 0.022930973071633345
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15873015873015872,
"acc_stderr": 0.03268454013011743,
"acc_norm": 0.15873015873015872,
"acc_norm_stderr": 0.03268454013011743
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3709677419354839,
"acc_stderr": 0.02748054188795359,
"acc_norm": 0.3709677419354839,
"acc_norm_stderr": 0.02748054188795359
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33004926108374383,
"acc_stderr": 0.03308530426228257,
"acc_norm": 0.33004926108374383,
"acc_norm_stderr": 0.03308530426228257
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.03895658065271846,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.03895658065271846
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5202020202020202,
"acc_stderr": 0.035594435655639176,
"acc_norm": 0.5202020202020202,
"acc_norm_stderr": 0.035594435655639176
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.47150259067357514,
"acc_stderr": 0.036025735712884414,
"acc_norm": 0.47150259067357514,
"acc_norm_stderr": 0.036025735712884414
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3,
"acc_stderr": 0.023234581088428494,
"acc_norm": 0.3,
"acc_norm_stderr": 0.023234581088428494
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073835,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073835
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.28991596638655465,
"acc_stderr": 0.029472485833136077,
"acc_norm": 0.28991596638655465,
"acc_norm_stderr": 0.029472485833136077
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360383,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360383
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.41467889908256883,
"acc_stderr": 0.021122903208602602,
"acc_norm": 0.41467889908256883,
"acc_norm_stderr": 0.021122903208602602
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.27314814814814814,
"acc_stderr": 0.03038805130167812,
"acc_norm": 0.27314814814814814,
"acc_norm_stderr": 0.03038805130167812
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.034602283272391704,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.034602283272391704
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4388185654008439,
"acc_stderr": 0.032302649315470375,
"acc_norm": 0.4388185654008439,
"acc_norm_stderr": 0.032302649315470375
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3632286995515695,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.3632286995515695,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3893129770992366,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.3893129770992366,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5537190082644629,
"acc_stderr": 0.0453793517794788,
"acc_norm": 0.5537190082644629,
"acc_norm_stderr": 0.0453793517794788
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.0478034362693679,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.0478034362693679
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3558282208588957,
"acc_stderr": 0.03761521380046735,
"acc_norm": 0.3558282208588957,
"acc_norm_stderr": 0.03761521380046735
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.043270409325787296,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.043270409325787296
},
"harness|hendrycksTest-management|5": {
"acc": 0.49514563106796117,
"acc_stderr": 0.04950504382128919,
"acc_norm": 0.49514563106796117,
"acc_norm_stderr": 0.04950504382128919
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5726495726495726,
"acc_stderr": 0.03240847393516327,
"acc_norm": 0.5726495726495726,
"acc_norm_stderr": 0.03240847393516327
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.44699872286079184,
"acc_stderr": 0.017779225233394216,
"acc_norm": 0.44699872286079184,
"acc_norm_stderr": 0.017779225233394216
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3872832369942196,
"acc_stderr": 0.02622615860512465,
"acc_norm": 0.3872832369942196,
"acc_norm_stderr": 0.02622615860512465
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.014310999547961455,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.014310999547961455
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.028358956313423545,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.028358956313423545
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.34726688102893893,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.34726688102893893,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.027125115513166858,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.027125115513166858
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.29432624113475175,
"acc_stderr": 0.027187127011503793,
"acc_norm": 0.29432624113475175,
"acc_norm_stderr": 0.027187127011503793
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.30834419817470665,
"acc_stderr": 0.011794833789715327,
"acc_norm": 0.30834419817470665,
"acc_norm_stderr": 0.011794833789715327
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2977941176470588,
"acc_stderr": 0.02777829870154544,
"acc_norm": 0.2977941176470588,
"acc_norm_stderr": 0.02777829870154544
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.019450768432505514,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.019450768432505514
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.4,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.40816326530612246,
"acc_stderr": 0.03146465712827424,
"acc_norm": 0.40816326530612246,
"acc_norm_stderr": 0.03146465712827424
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.582089552238806,
"acc_stderr": 0.034875586404620636,
"acc_norm": 0.582089552238806,
"acc_norm_stderr": 0.034875586404620636
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3253012048192771,
"acc_stderr": 0.03647168523683227,
"acc_norm": 0.3253012048192771,
"acc_norm_stderr": 0.03647168523683227
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.036996580176568775,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.036996580176568775
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22643818849449204,
"mc1_stderr": 0.014651337324602588,
"mc2": 0.47219104025186426,
"mc2_stderr": 0.016351942852493542
},
"harness|winogrande|5": {
"acc": 0.5611681136543015,
"acc_stderr": 0.013946933444507032
},
"harness|gsm8k|5": {
"acc": 0.006065200909780136,
"acc_stderr": 0.0021386703014604795
}
}
```
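As an illustrative sketch (not part of the official evaluation tooling), the aggregated metrics in the `"all"` block above can be reformatted into human-readable percentages once the results JSON has been loaded; the dictionary below hard-codes the values shown above for self-containment:

```python
# Values copied from the "all" block of the latest results above.
latest_results = {
    "all": {
        "acc": 0.3659848060809438,
        "acc_norm": 0.37124839114399955,
        "mc1": 0.22643818849449204,
        "mc2": 0.47219104025186426,
    }
}

def as_percent(value: float) -> str:
    """Format a [0, 1] metric as a percentage with two decimal places."""
    return f"{value * 100:.2f}%"

# Build a readable summary of the aggregated metrics.
summary = {name: as_percent(v) for name, v in latest_results["all"].items()}
print(summary)
```

In practice you would read these values from the `results` config (or the JSON file linked above) rather than hard-coding them.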
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
varcoder/EqualDistributionDataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Aluminum
'1': Steel
'2': Wood
splits:
- name: train
num_bytes: 2194882689.49
num_examples: 8590
download_size: 244159
dataset_size: 2194882689.49
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bidda/bidda-llama2-211rformated | ---
dataset_info:
features:
- name: Content
dtype: string
splits:
- name: train
num_bytes: 1371789
num_examples: 207
download_size: 590493
dataset_size: 1371789
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
DjSteker/Electronica | ---
language:
- es
task_categories:
- question-answering
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 6413
num_examples: 13
download_size: 10575
dataset_size: 6413
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
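To make the schema declared in the YAML header concrete: each row carries four string fields (`id`, `url`, `title`, `text`). A hypothetical record is sketched below; every field value is invented purely for illustration and does not come from the dataset itself.

```python
# Hypothetical example record matching the declared features;
# all values below are illustrative, not real dataset content.
record = {
    "id": "0",
    "url": "https://example.org/articulo",
    "title": "Ley de Ohm",
    "text": "La ley de Ohm relaciona tension, corriente y resistencia.",
}

# Every declared feature should be present and string-typed.
assert set(record) == {"id", "url", "title", "text"}
assert all(isinstance(v, str) for v in record.values())
```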
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
citrusandfriends/sutd_qa_dataset | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 80345
num_examples: 197
download_size: 39948
dataset_size: 80345
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
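A quick back-of-the-envelope check on the split statistics declared in the YAML header above (both numbers are taken directly from it):

```python
# Declared split statistics from the YAML header.
num_bytes = 80_345
num_examples = 197

avg_bytes = num_bytes / num_examples
print(f"~{avg_bytes:.0f} bytes per QA pair")  # ~408
```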
|
TinyPixel/k_3 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1446995520
num_examples: 840090
download_size: 784964827
dataset_size: 1446995520
---
# Dataset Card for "k_3"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_perlthoughts__neural-chat-v3-3-8x7b-MoE | ---
pretty_name: Evaluation run of perlthoughts/neural-chat-v3-3-8x7b-MoE
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [perlthoughts/neural-chat-v3-3-8x7b-MoE](https://huggingface.co/perlthoughts/neural-chat-v3-3-8x7b-MoE)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one\
  \ of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_perlthoughts__neural-chat-v3-3-8x7b-MoE\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-18T05:24:06.077139](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__neural-chat-v3-3-8x7b-MoE/blob/main/results_2023-12-18T05-24-06.077139.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6274049176829637,\n\
\ \"acc_stderr\": 0.03275930044853432,\n \"acc_norm\": 0.6268605636213929,\n\
\ \"acc_norm_stderr\": 0.033440518650225654,\n \"mc1\": 0.47368421052631576,\n\
\ \"mc1_stderr\": 0.017479241161975526,\n \"mc2\": 0.631965606310606,\n\
\ \"mc2_stderr\": 0.015067807381751251\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6459044368600683,\n \"acc_stderr\": 0.013975454122756562,\n\
\ \"acc_norm\": 0.6663822525597269,\n \"acc_norm_stderr\": 0.013778687054176536\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.666301533559052,\n\
\ \"acc_stderr\": 0.004705697745222153,\n \"acc_norm\": 0.8543118900617407,\n\
\ \"acc_norm_stderr\": 0.003520722505332094\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.03878139888797612,\n\
\ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.03878139888797612\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.032469569197899575,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.032469569197899575\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159798,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159798\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7419354838709677,\n\
\ \"acc_stderr\": 0.02489246917246283,\n \"acc_norm\": 0.7419354838709677,\n\
\ \"acc_norm_stderr\": 0.02489246917246283\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.035107665979592154,\n\
\ \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.617948717948718,\n \"acc_stderr\": 0.024635549163908234,\n \
\ \"acc_norm\": 0.617948717948718,\n \"acc_norm_stderr\": 0.024635549163908234\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.03120469122515002,\n \
\ \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.03120469122515002\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8293577981651377,\n\
\ \"acc_stderr\": 0.016129271025099867,\n \"acc_norm\": 0.8293577981651377,\n\
\ \"acc_norm_stderr\": 0.016129271025099867\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n\
\ \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967408,\n \"\
acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967408\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946315,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946315\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077802,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077802\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8020434227330779,\n\
\ \"acc_stderr\": 0.014248873549217575,\n \"acc_norm\": 0.8020434227330779,\n\
\ \"acc_norm_stderr\": 0.014248873549217575\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.02454761779480383,\n\
\ \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.02454761779480383\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45027932960893857,\n\
\ \"acc_stderr\": 0.016639615236845807,\n \"acc_norm\": 0.45027932960893857,\n\
\ \"acc_norm_stderr\": 0.016639615236845807\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.02641560191438898,\n\
\ \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.02641560191438898\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.025483115601195455,\n\
\ \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.025483115601195455\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.02970045324729146,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.02970045324729146\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4256844850065189,\n\
\ \"acc_stderr\": 0.012628393551811947,\n \"acc_norm\": 0.4256844850065189,\n\
\ \"acc_norm_stderr\": 0.012628393551811947\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.029289413409403192,\n\
\ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.029289413409403192\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6274509803921569,\n \"acc_stderr\": 0.019559646809215923,\n \
\ \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.019559646809215923\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421606,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421606\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.47368421052631576,\n\
\ \"mc1_stderr\": 0.017479241161975526,\n \"mc2\": 0.631965606310606,\n\
\ \"mc2_stderr\": 0.015067807381751251\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7971586424625099,\n \"acc_stderr\": 0.011301439925936662\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6982562547384382,\n \
\ \"acc_stderr\": 0.012643544762873358\n }\n}\n```"
repo_url: https://huggingface.co/perlthoughts/neural-chat-v3-3-8x7b-MoE
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|arc:challenge|25_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|gsm8k|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hellaswag|10_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T05-24-06.077139.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-18T05-24-06.077139.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- '**/details_harness|winogrande|5_2023-12-18T05-24-06.077139.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-18T05-24-06.077139.parquet'
- config_name: results
data_files:
- split: 2023_12_18T05_24_06.077139
path:
- results_2023-12-18T05-24-06.077139.parquet
- split: latest
path:
- results_2023-12-18T05-24-06.077139.parquet
---
# Dataset Card for Evaluation run of perlthoughts/neural-chat-v3-3-8x7b-MoE
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [perlthoughts/neural-chat-v3-3-8x7b-MoE](https://huggingface.co/perlthoughts/neural-chat-v3-3-8x7b-MoE) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_perlthoughts__neural-chat-v3-3-8x7b-MoE",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-18T05:24:06.077139](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__neural-chat-v3-3-8x7b-MoE/blob/main/results_2023-12-18T05-24-06.077139.json) (note that there might be results for other tasks in these repos if successive evals didn't cover the same tasks; you can find each task's results in its timestamped split and in the "latest" split of its configuration):
```python
{
"all": {
"acc": 0.6274049176829637,
"acc_stderr": 0.03275930044853432,
"acc_norm": 0.6268605636213929,
"acc_norm_stderr": 0.033440518650225654,
"mc1": 0.47368421052631576,
"mc1_stderr": 0.017479241161975526,
"mc2": 0.631965606310606,
"mc2_stderr": 0.015067807381751251
},
"harness|arc:challenge|25": {
"acc": 0.6459044368600683,
"acc_stderr": 0.013975454122756562,
"acc_norm": 0.6663822525597269,
"acc_norm_stderr": 0.013778687054176536
},
"harness|hellaswag|10": {
"acc": 0.666301533559052,
"acc_stderr": 0.004705697745222153,
"acc_norm": 0.8543118900617407,
"acc_norm_stderr": 0.003520722505332094
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.03878139888797612,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.03878139888797612
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.032469569197899575,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.032469569197899575
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.024942368931159798,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.024942368931159798
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7419354838709677,
"acc_stderr": 0.02489246917246283,
"acc_norm": 0.7419354838709677,
"acc_norm_stderr": 0.02489246917246283
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758733,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.617948717948718,
"acc_stderr": 0.024635549163908234,
"acc_norm": 0.617948717948718,
"acc_norm_stderr": 0.024635549163908234
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.03120469122515002,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.03120469122515002
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8293577981651377,
"acc_stderr": 0.016129271025099867,
"acc_norm": 0.8293577981651377,
"acc_norm_stderr": 0.016129271025099867
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967408,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967408
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946315,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946315
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077802,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077802
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8020434227330779,
"acc_stderr": 0.014248873549217575,
"acc_norm": 0.8020434227330779,
"acc_norm_stderr": 0.014248873549217575
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.02454761779480383,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.02454761779480383
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45027932960893857,
"acc_stderr": 0.016639615236845807,
"acc_norm": 0.45027932960893857,
"acc_norm_stderr": 0.016639615236845807
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.02641560191438898,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.02641560191438898
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.025483115601195455,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.025483115601195455
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.02970045324729146,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.02970045324729146
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4256844850065189,
"acc_stderr": 0.012628393551811947,
"acc_norm": 0.4256844850065189,
"acc_norm_stderr": 0.012628393551811947
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.029289413409403192,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.029289413409403192
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.019559646809215923,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.019559646809215923
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421606,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421606
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.47368421052631576,
"mc1_stderr": 0.017479241161975526,
"mc2": 0.631965606310606,
"mc2_stderr": 0.015067807381751251
},
"harness|winogrande|5": {
"acc": 0.7971586424625099,
"acc_stderr": 0.011301439925936662
},
"harness|gsm8k|5": {
"acc": 0.6982562547384382,
"acc_stderr": 0.012643544762873358
}
}
```
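The results file itself is plain JSON, so the aggregate metrics can also be read directly without the `datasets` library. A minimal sketch, assuming the payload has the shape shown above (here a small inlined stand-in is parsed instead of a downloaded file):

```python
import json

# Stand-in for the downloaded results JSON; the aggregated metrics of a
# run live under the "all" key, with per-task entries alongside it.
payload = json.loads(
    '{"all": {"acc": 0.6274049176829637, "acc_norm": 0.6268605636213929}}'
)

agg = payload["all"]
print(f"acc={agg['acc']:.4f}, acc_norm={agg['acc_norm']:.4f}")
```

To work with a real run, replace the inlined string with the contents of the `results_*.json` file linked above.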
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ISCA-IUB/AntisemitismOnTwitter | ---
language:
- en
---
# Dataset Card for Dataset on Antisemitism on Twitter/X
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
The ISCA project compiled this dataset using an annotation portal, which was used to label tweets as either antisemitic or non-antisemitic, among other labels. Note that annotation was performed on live data, including images and context such as threads. The original data was sourced from annotationportal.com.
### Languages
English
## Dataset Structure
- ‘TweetID’: Represents the tweet ID.
- ‘Username’: Represents the username of the account that published the tweet.
- ‘Text’: Represents the full, unprocessed text of the tweet.
- ‘CreateDate’: Represents the date the tweet was created.
- ‘Biased’: Represents the label assigned by our annotators, indicating whether the tweet is antisemitic or non-antisemitic.
- ‘Keyword’: Represents the keyword used in the query. The keyword may appear in the tweet text (including mentioned names) or in the username.
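The fields above can be illustrated with a minimal sketch. The column names come from this card; the record values and the 1/0 encoding of ‘Biased’ are invented here for illustration only:

```python
# Illustrative records following the schema described above.
# Field names come from the dataset card; values (and the 1/0
# encoding of 'Biased') are assumptions made for this example.
records = [
    {"TweetID": "1", "Username": "user_a", "Text": "example tweet one",
     "CreateDate": "2019-03-14", "Biased": 1, "Keyword": "jews"},
    {"TweetID": "2", "Username": "user_b", "Text": "example tweet two",
     "CreateDate": "2021-11-02", "Biased": 0, "Keyword": "israel"},
]

# Split records by the binary 'Biased' label (here: 1 = antisemitic).
antisemitic = [r for r in records if r["Biased"] == 1]
share = len(antisemitic) / len(records)
```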
## Dataset Creation
This dataset contains 6,941 tweets that cover a wide range of topics common in conversations about Jews, Israel, and antisemitism between January 2019 and December 2021. The dataset is drawn from representative samples during this period with relevant keywords. 1,250 tweets (18%) meet the IHRA definition of antisemitic messages.
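The class balance quoted above follows directly from the stated counts:

```python
# Counts taken from the dataset card: 6,941 tweets total,
# of which 1,250 meet the IHRA definition of antisemitic messages.
total, antisemitic_count = 6941, 1250

# Share of antisemitic tweets, rounded to whole percent.
share_pct = round(antisemitic_count / total * 100)
```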
The dataset has been compiled within the ISCA project using an annotation portal to label tweets as either antisemitic or non-antisemitic. The original data was sourced from annotationportal.com.
### Annotations
#### Annotation process
We annotated the tweets in their “natural” context, including threads, considering the text, images, videos, and links. We used a detailed annotation guideline based on the IHRA Definition, which has been endorsed and recommended by more than 30 governments and international organizations and is frequently used to monitor and record antisemitic incidents. We divided the definition into 12 paragraphs, each addressing different forms and tropes of antisemitism. We created an online annotation tool (https://annotationportal.com) to make labeling easier, more consistent, and less error-prone, including when recording the annotations. The portal displays the tweet alongside a clickable annotation form (see Figure 1) and automatically saves each annotation, including the time spent labeling each tweet.
The Annotation Portal retrieves live tweets by referencing their ID number. Our annotators first look at the tweet, and if they are unsure of the meaning, they are prompted to look at the entire thread, replies, likes, links, and comments. A click on the visualized tweet opens a new tab in the browser, displaying the message on the Twitter page in its “natural” environment.
The portal is designed to help annotators consistently label messages as antisemitic or not according to the IHRA definition. After verifying that the message is still live and in English, annotators use a drop-down menu to classify the message as "confident antisemitic," "probably antisemitic," "probably not antisemitic," "confident not antisemitic," or "don't know." The annotation guideline, including the definition, is linked as a PDF document.
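One plausible way to collapse the five-way drop-down annotation into the binary ‘Biased’ label can be sketched as follows. Note that this aggregation rule (grouping "probably" with "confident", and excluding "don't know") is an assumption for illustration; the card does not specify how the collapse is done:

```python
# Map the five drop-down choices described above to a binary label.
# Grouping "probably" with "confident" is an assumption made for
# this sketch; the card does not state the actual aggregation rule.
LABEL_MAP = {
    "confident antisemitic": 1,
    "probably antisemitic": 1,
    "probably not antisemitic": 0,
    "confident not antisemitic": 0,
    "don't know": None,  # excluded from the binary label
}

def to_binary(annotation: str):
    """Return 1 (antisemitic), 0 (not antisemitic), or None (undecided)."""
    return LABEL_MAP[annotation]
```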
#### Who are the annotators?
All annotators are familiar with the definition and have been trained on test samples. They have also taken at least one academic course on antisemitism or have done research on antisemitism. We consider them to be expert annotators. Eight such expert annotators of different religions and genders labeled the 18 samples, two for each sample in alternating configurations.
## Considerations for Using the Data
### Social Impact of Dataset
One of the major challenges in automatic hate speech detection is the lack of datasets that cover a wide range of biased and unbiased messages and that are consistently labeled. We propose a labeling procedure that addresses some of the common weaknesses of labeled datasets.
We focus on antisemitic speech on Twitter and create a labeled dataset of 6,941 tweets that cover a wide range of topics common in conversations about Jews, Israel, and antisemitism between January 2019 and December 2021 by drawing from representative samples with relevant keywords.
Our annotation process aims to strictly apply a commonly used definition of antisemitism by forcing annotators to specify which part of the definition applies, and by giving them the option to personally disagree with the definition on a case-by-case basis. Labeling tweets that call out antisemitism, report antisemitism, or are otherwise related to antisemitism (such as the Holocaust) but are not actually antisemitic can help reduce false positives in automated detection.
## Additional Information
### Dataset Curators
Gunther Jikeli, Sameer Karali, Daniel Miehling, and Katharina Soemer
### Citation Information
Jikeli, Gunther, Sameer Karali, Daniel Miehling, and Katharina Soemer (2023): Antisemitic Messages? A Guide to High-Quality Annotation and a Labeled Dataset of Tweets. https://arxiv.org/abs/2304.14599
|
Jing24/sort_high_all_train | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
splits:
- name: train
num_bytes: 79676027
num_examples: 87599
download_size: 32663100
dataset_size: 79676027
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "sort_high_all_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pvduy/dpo_data_baai_50k | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 7316399
num_examples: 2418
- name: test
num_bytes: 8556760
num_examples: 1964
- name: train_prefs
num_bytes: 84336313
num_examples: 50000
- name: test_prefs
num_bytes: 66468
num_examples: 10
download_size: 52935704
dataset_size: 100275940
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: train_prefs
path: data/train_prefs-*
- split: test_prefs
path: data/test_prefs-*
---
|
likhithnemani/github_repo_dataset | ---
license: apache-2.0
dataset_info:
features:
- name: Repo Name
dtype: string
- name: File Names
dtype: string
- name: Project Description
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 19272245
num_examples: 1460
- name: test
num_bytes: 28422158
num_examples: 366
download_size: 8350638
dataset_size: 47694403
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
xL0G1Cx/embeddings | ---
license: mit
---
|
open-llm-leaderboard/details_cloudyu__Pluto_24B_DPO_200 | ---
pretty_name: Evaluation run of cloudyu/Pluto_24B_DPO_200
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cloudyu/Pluto_24B_DPO_200](https://huggingface.co/cloudyu/Pluto_24B_DPO_200)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cloudyu__Pluto_24B_DPO_200\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-18T17:18:01.366806](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Pluto_24B_DPO_200/blob/main/results_2024-01-18T17-18-01.366806.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6487883183265996,\n\
\ \"acc_stderr\": 0.03206766377553213,\n \"acc_norm\": 0.649809388886223,\n\
\ \"acc_norm_stderr\": 0.03271483221046768,\n \"mc1\": 0.5128518971848225,\n\
\ \"mc1_stderr\": 0.017497717944299822,\n \"mc2\": 0.6986184584005906,\n\
\ \"mc2_stderr\": 0.014631943760685329\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6373720136518771,\n \"acc_stderr\": 0.014049106564955003,\n\
\ \"acc_norm\": 0.6561433447098977,\n \"acc_norm_stderr\": 0.013880644570156213\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6717785301732723,\n\
\ \"acc_stderr\": 0.004686062421158146,\n \"acc_norm\": 0.8637721569408484,\n\
\ \"acc_norm_stderr\": 0.0034232928816321398\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998905,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998905\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.028049186315695248,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.028049186315695248\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305526,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305526\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721175,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721175\n \
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.02385479568097112,\n \
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.02385479568097112\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7100840336134454,\n \"acc_stderr\": 0.029472485833136098,\n\
\ \"acc_norm\": 0.7100840336134454,\n \"acc_norm_stderr\": 0.029472485833136098\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"\
acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601446,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601446\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993457,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993457\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.02344582627654554,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.02344582627654554\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38324022346368714,\n\
\ \"acc_stderr\": 0.016260159604429128,\n \"acc_norm\": 0.38324022346368714,\n\
\ \"acc_norm_stderr\": 0.016260159604429128\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667888,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667888\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47196870925684486,\n\
\ \"acc_stderr\": 0.012750151802922438,\n \"acc_norm\": 0.47196870925684486,\n\
\ \"acc_norm_stderr\": 0.012750151802922438\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6683006535947712,\n \"acc_stderr\": 0.019047485239360378,\n \
\ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.019047485239360378\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306046,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306046\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5128518971848225,\n\
\ \"mc1_stderr\": 0.017497717944299822,\n \"mc2\": 0.6986184584005906,\n\
\ \"mc2_stderr\": 0.014631943760685329\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7892659826361483,\n \"acc_stderr\": 0.011462046419710683\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6588324488248674,\n \
\ \"acc_stderr\": 0.013059111935831497\n }\n}\n```"
repo_url: https://huggingface.co/cloudyu/Pluto_24B_DPO_200
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|arc:challenge|25_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|gsm8k|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hellaswag|10_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T17-18-01.366806.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-18T17-18-01.366806.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- '**/details_harness|winogrande|5_2024-01-18T17-18-01.366806.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-18T17-18-01.366806.parquet'
- config_name: results
data_files:
- split: 2024_01_18T17_18_01.366806
path:
- results_2024-01-18T17-18-01.366806.parquet
- split: latest
path:
- results_2024-01-18T17-18-01.366806.parquet
---
# Dataset Card for Evaluation run of cloudyu/Pluto_24B_DPO_200
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cloudyu/Pluto_24B_DPO_200](https://huggingface.co/cloudyu/Pluto_24B_DPO_200) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cloudyu__Pluto_24B_DPO_200",
"harness_winogrande_5",
	split="latest")
```
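Each per-task config name encodes the harness task and the few-shot count (e.g. `harness_hendrycksTest_abstract_algebra_5` is the abstract-algebra MMLU task evaluated 5-shot). A minimal sketch of splitting such a name apart, using a hypothetical helper `parse_config_name` (not part of any library):

```python
def parse_config_name(name: str) -> tuple[str, int]:
    """Split a leaderboard config name like
    'harness_hendrycksTest_abstract_algebra_5' into
    (task_name, num_fewshot). Assumes the trailing
    '_<digits>' segment is the few-shot count."""
    prefix, _, shots = name.rpartition("_")
    # Drop the 'harness_' prefix to get the bare task name.
    task = prefix.removeprefix("harness_")
    return task, int(shots)
```

For example, `parse_config_name("harness_winogrande_5")` yields `("winogrande", 5)`, matching the `split="latest"` config names used above.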
## Latest results
These are the [latest results from run 2024-01-18T17:18:01.366806](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Pluto_24B_DPO_200/blob/main/results_2024-01-18T17-18-01.366806.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6487883183265996,
"acc_stderr": 0.03206766377553213,
"acc_norm": 0.649809388886223,
"acc_norm_stderr": 0.03271483221046768,
"mc1": 0.5128518971848225,
"mc1_stderr": 0.017497717944299822,
"mc2": 0.6986184584005906,
"mc2_stderr": 0.014631943760685329
},
"harness|arc:challenge|25": {
"acc": 0.6373720136518771,
"acc_stderr": 0.014049106564955003,
"acc_norm": 0.6561433447098977,
"acc_norm_stderr": 0.013880644570156213
},
"harness|hellaswag|10": {
"acc": 0.6717785301732723,
"acc_stderr": 0.004686062421158146,
"acc_norm": 0.8637721569408484,
"acc_norm_stderr": 0.0034232928816321398
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998905,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998905
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.028049186315695248,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.028049186315695248
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.02535574126305526,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.02535574126305526
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.031234752377721175,
"acc_norm": 0.8,
"acc_norm_stderr": 0.031234752377721175
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.02385479568097112,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.02385479568097112
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7100840336134454,
"acc_stderr": 0.029472485833136098,
"acc_norm": 0.7100840336134454,
"acc_norm_stderr": 0.029472485833136098
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474082,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601446,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993457,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993457
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.02344582627654554,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.02344582627654554
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38324022346368714,
"acc_stderr": 0.016260159604429128,
"acc_norm": 0.38324022346368714,
"acc_norm_stderr": 0.016260159604429128
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.025457756696667888,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.025457756696667888
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984813,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904211,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904211
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47196870925684486,
"acc_stderr": 0.012750151802922438,
"acc_norm": 0.47196870925684486,
"acc_norm_stderr": 0.012750151802922438
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.019047485239360378,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.019047485239360378
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306046,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306046
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5128518971848225,
"mc1_stderr": 0.017497717944299822,
"mc2": 0.6986184584005906,
"mc2_stderr": 0.014631943760685329
},
"harness|winogrande|5": {
"acc": 0.7892659826361483,
"acc_stderr": 0.011462046419710683
},
"harness|gsm8k|5": {
"acc": 0.6588324488248674,
"acc_stderr": 0.013059111935831497
}
}
```
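The aggregate scores in the `"all"` block of results like the one above are, as far as I understand the leaderboard harness (treat this as an assumption, not documented behavior), unweighted macro-averages of the per-task accuracies. A minimal sketch using a small, hand-picked subset of the values shown above:

```python
import statistics

# Per-task accuracies excerpted from the results above (three tasks only,
# for illustration -- the full run covers all 57 hendrycksTest subjects
# plus ARC, HellaSwag, TruthfulQA, Winogrande, and GSM8K).
task_acc = {
    "hendrycksTest-high_school_statistics": 0.5185185185185185,
    "hendrycksTest-marketing": 0.8888888888888888,
    "hendrycksTest-virology": 0.5180722891566265,
}

# Assumed aggregation: the unweighted mean of per-task accuracies,
# i.e. every task counts equally regardless of its number of examples.
macro_avg = statistics.mean(task_acc.values())
print(round(macro_avg, 4))  # -> 0.6418
```

Because the average is unweighted, small subjects (e.g. 100-question tasks) move the aggregate as much as large ones do.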
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
imZoe/actionbaseddataset | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 150194
num_examples: 101
download_size: 58013
dataset_size: 150194
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_kz919__mistral-7b-dpo-open-orca-flan-50k-synthetic-5-models | ---
pretty_name: Evaluation run of kz919/mistral-7b-dpo-open-orca-flan-50k-synthetic-5-models
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [kz919/mistral-7b-dpo-open-orca-flan-50k-synthetic-5-models](https://huggingface.co/kz919/mistral-7b-dpo-open-orca-flan-50k-synthetic-5-models)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kz919__mistral-7b-dpo-open-orca-flan-50k-synthetic-5-models\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-14T18:15:50.698529](https://huggingface.co/datasets/open-llm-leaderboard/details_kz919__mistral-7b-dpo-open-orca-flan-50k-synthetic-5-models/blob/main/results_2024-01-14T18-15-50.698529.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26648871501929594,\n\
\ \"acc_stderr\": 0.03093030883128489,\n \"acc_norm\": 0.2677809133729311,\n\
\ \"acc_norm_stderr\": 0.03175527446298885,\n \"mc1\": 0.2521419828641371,\n\
\ \"mc1_stderr\": 0.015201522246299953,\n \"mc2\": 0.4880571743853537,\n\
\ \"mc2_stderr\": 0.0172850771661607\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.20819112627986347,\n \"acc_stderr\": 0.011864866118448064,\n\
\ \"acc_norm\": 0.2551194539249147,\n \"acc_norm_stderr\": 0.012739038695202105\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25692093208524197,\n\
\ \"acc_stderr\": 0.004360424536145122,\n \"acc_norm\": 0.2552280422226648,\n\
\ \"acc_norm_stderr\": 0.004350982826580604\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n\
\ \"acc_stderr\": 0.03633384414073461,\n \"acc_norm\": 0.22962962962962963,\n\
\ \"acc_norm_stderr\": 0.03633384414073461\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.34868421052631576,\n \"acc_stderr\": 0.03878139888797611,\n\
\ \"acc_norm\": 0.34868421052631576,\n \"acc_norm_stderr\": 0.03878139888797611\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.04020151261036844,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.04020151261036844\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2981132075471698,\n \"acc_stderr\": 0.02815283794249386,\n\
\ \"acc_norm\": 0.2981132075471698,\n \"acc_norm_stderr\": 0.02815283794249386\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.32947976878612717,\n\
\ \"acc_stderr\": 0.03583901754736411,\n \"acc_norm\": 0.32947976878612717,\n\
\ \"acc_norm_stderr\": 0.03583901754736411\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082633,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082633\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.18,\n\
\ \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.23829787234042554,\n \"acc_stderr\": 0.027851252973889774,\n\
\ \"acc_norm\": 0.23829787234042554,\n \"acc_norm_stderr\": 0.027851252973889774\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.036951833116502325,\n\
\ \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.036951833116502325\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525218,\n \"\
acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525218\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.31290322580645163,\n\
\ \"acc_stderr\": 0.026377567028645854,\n \"acc_norm\": 0.31290322580645163,\n\
\ \"acc_norm_stderr\": 0.026377567028645854\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.270935960591133,\n \"acc_stderr\": 0.031270907132976984,\n\
\ \"acc_norm\": 0.270935960591133,\n \"acc_norm_stderr\": 0.031270907132976984\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\"\
: 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.35353535353535354,\n \"acc_stderr\": 0.03406086723547153,\n \"\
acc_norm\": 0.35353535353535354,\n \"acc_norm_stderr\": 0.03406086723547153\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.35233160621761656,\n \"acc_stderr\": 0.03447478286414359,\n\
\ \"acc_norm\": 0.35233160621761656,\n \"acc_norm_stderr\": 0.03447478286414359\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3435897435897436,\n \"acc_stderr\": 0.02407869658063547,\n \
\ \"acc_norm\": 0.3435897435897436,\n \"acc_norm_stderr\": 0.02407869658063547\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.031041941304059285,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.031041941304059285\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658754,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658754\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3559633027522936,\n \"acc_stderr\": 0.020528559278244218,\n \"\
acc_norm\": 0.3559633027522936,\n \"acc_norm_stderr\": 0.020528559278244218\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502325,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502325\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.20675105485232068,\n \"acc_stderr\": 0.026361651668389104,\n\
\ \"acc_norm\": 0.20675105485232068,\n \"acc_norm_stderr\": 0.026361651668389104\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.12556053811659193,\n\
\ \"acc_stderr\": 0.02223898546932376,\n \"acc_norm\": 0.12556053811659193,\n\
\ \"acc_norm_stderr\": 0.02223898546932376\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.14049586776859505,\n \"acc_stderr\": 0.03172233426002161,\n \"\
acc_norm\": 0.14049586776859505,\n \"acc_norm_stderr\": 0.03172233426002161\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.15178571428571427,\n\
\ \"acc_stderr\": 0.03405702838185694,\n \"acc_norm\": 0.15178571428571427,\n\
\ \"acc_norm_stderr\": 0.03405702838185694\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258972,\n\
\ \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258972\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19230769230769232,\n\
\ \"acc_stderr\": 0.025819233256483706,\n \"acc_norm\": 0.19230769230769232,\n\
\ \"acc_norm_stderr\": 0.025819233256483706\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.20561941251596424,\n\
\ \"acc_stderr\": 0.014452500456785825,\n \"acc_norm\": 0.20561941251596424,\n\
\ \"acc_norm_stderr\": 0.014452500456785825\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.21965317919075145,\n \"acc_stderr\": 0.022289638852617904,\n\
\ \"acc_norm\": 0.21965317919075145,\n \"acc_norm_stderr\": 0.022289638852617904\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3006535947712418,\n \"acc_stderr\": 0.02625605383571896,\n\
\ \"acc_norm\": 0.3006535947712418,\n \"acc_norm_stderr\": 0.02625605383571896\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24758842443729903,\n\
\ \"acc_stderr\": 0.024513879973621967,\n \"acc_norm\": 0.24758842443729903,\n\
\ \"acc_norm_stderr\": 0.024513879973621967\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.22530864197530864,\n \"acc_stderr\": 0.023246202647819746,\n\
\ \"acc_norm\": 0.22530864197530864,\n \"acc_norm_stderr\": 0.023246202647819746\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25177304964539005,\n \"acc_stderr\": 0.0258921511567094,\n \
\ \"acc_norm\": 0.25177304964539005,\n \"acc_norm_stderr\": 0.0258921511567094\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23859191655801826,\n\
\ \"acc_stderr\": 0.010885929742002221,\n \"acc_norm\": 0.23859191655801826,\n\
\ \"acc_norm_stderr\": 0.010885929742002221\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.030134614954403924,\n \
\ \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.030134614954403924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.21405228758169934,\n \"acc_stderr\": 0.01659342966232903,\n \
\ \"acc_norm\": 0.21405228758169934,\n \"acc_norm_stderr\": 0.01659342966232903\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.24545454545454545,\n\
\ \"acc_stderr\": 0.041220665028782834,\n \"acc_norm\": 0.24545454545454545,\n\
\ \"acc_norm_stderr\": 0.041220665028782834\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.39591836734693875,\n \"acc_stderr\": 0.03130802899065686,\n\
\ \"acc_norm\": 0.39591836734693875,\n \"acc_norm_stderr\": 0.03130802899065686\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.263681592039801,\n\
\ \"acc_stderr\": 0.03115715086935556,\n \"acc_norm\": 0.263681592039801,\n\
\ \"acc_norm_stderr\": 0.03115715086935556\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322674,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322674\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21084337349397592,\n\
\ \"acc_stderr\": 0.03175554786629921,\n \"acc_norm\": 0.21084337349397592,\n\
\ \"acc_norm_stderr\": 0.03175554786629921\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.17543859649122806,\n \"acc_stderr\": 0.029170885500727654,\n\
\ \"acc_norm\": 0.17543859649122806,\n \"acc_norm_stderr\": 0.029170885500727654\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2521419828641371,\n\
\ \"mc1_stderr\": 0.015201522246299953,\n \"mc2\": 0.4880571743853537,\n\
\ \"mc2_stderr\": 0.0172850771661607\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5019731649565904,\n \"acc_stderr\": 0.014052376259225636\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/kz919/mistral-7b-dpo-open-orca-flan-50k-synthetic-5-models
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|arc:challenge|25_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|gsm8k|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hellaswag|10_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T18-15-50.698529.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T18-15-50.698529.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- '**/details_harness|winogrande|5_2024-01-14T18-15-50.698529.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-14T18-15-50.698529.parquet'
- config_name: results
data_files:
- split: 2024_01_14T18_15_50.698529
path:
- results_2024-01-14T18-15-50.698529.parquet
- split: latest
path:
- results_2024-01-14T18-15-50.698529.parquet
---
# Dataset Card for Evaluation run of kz919/mistral-7b-dpo-open-orca-flan-50k-synthetic-5-models
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kz919/mistral-7b-dpo-open-orca-flan-50k-synthetic-5-models](https://huggingface.co/kz919/mistral-7b-dpo-open-orca-flan-50k-synthetic-5-models) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kz919__mistral-7b-dpo-open-orca-flan-50k-synthetic-5-models",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-14T18:15:50.698529](https://huggingface.co/datasets/open-llm-leaderboard/details_kz919__mistral-7b-dpo-open-orca-flan-50k-synthetic-5-models/blob/main/results_2024-01-14T18-15-50.698529.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the corresponding results and "latest" splits):
```python
{
"all": {
"acc": 0.26648871501929594,
"acc_stderr": 0.03093030883128489,
"acc_norm": 0.2677809133729311,
"acc_norm_stderr": 0.03175527446298885,
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299953,
"mc2": 0.4880571743853537,
"mc2_stderr": 0.0172850771661607
},
"harness|arc:challenge|25": {
"acc": 0.20819112627986347,
"acc_stderr": 0.011864866118448064,
"acc_norm": 0.2551194539249147,
"acc_norm_stderr": 0.012739038695202105
},
"harness|hellaswag|10": {
"acc": 0.25692093208524197,
"acc_stderr": 0.004360424536145122,
"acc_norm": 0.2552280422226648,
"acc_norm_stderr": 0.004350982826580604
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073461,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073461
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.34868421052631576,
"acc_stderr": 0.03878139888797611,
"acc_norm": 0.34868421052631576,
"acc_norm_stderr": 0.03878139888797611
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036844,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036844
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2981132075471698,
"acc_stderr": 0.02815283794249386,
"acc_norm": 0.2981132075471698,
"acc_norm_stderr": 0.02815283794249386
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.32947976878612717,
"acc_stderr": 0.03583901754736411,
"acc_norm": 0.32947976878612717,
"acc_norm_stderr": 0.03583901754736411
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082633,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082633
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.23829787234042554,
"acc_stderr": 0.027851252973889774,
"acc_norm": 0.23829787234042554,
"acc_norm_stderr": 0.027851252973889774
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.036951833116502325,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.036951833116502325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525218,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525218
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.31290322580645163,
"acc_stderr": 0.026377567028645854,
"acc_norm": 0.31290322580645163,
"acc_norm_stderr": 0.026377567028645854
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.270935960591133,
"acc_stderr": 0.031270907132976984,
"acc_norm": 0.270935960591133,
"acc_norm_stderr": 0.031270907132976984
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35353535353535354,
"acc_stderr": 0.03406086723547153,
"acc_norm": 0.35353535353535354,
"acc_norm_stderr": 0.03406086723547153
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.35233160621761656,
"acc_stderr": 0.03447478286414359,
"acc_norm": 0.35233160621761656,
"acc_norm_stderr": 0.03447478286414359
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3435897435897436,
"acc_stderr": 0.02407869658063547,
"acc_norm": 0.3435897435897436,
"acc_norm_stderr": 0.02407869658063547
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.031041941304059285,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.031041941304059285
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658754,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3559633027522936,
"acc_stderr": 0.020528559278244218,
"acc_norm": 0.3559633027522936,
"acc_norm_stderr": 0.020528559278244218
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.20675105485232068,
"acc_stderr": 0.026361651668389104,
"acc_norm": 0.20675105485232068,
"acc_norm_stderr": 0.026361651668389104
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.12556053811659193,
"acc_stderr": 0.02223898546932376,
"acc_norm": 0.12556053811659193,
"acc_norm_stderr": 0.02223898546932376
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2824427480916031,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.2824427480916031,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.14049586776859505,
"acc_stderr": 0.03172233426002161,
"acc_norm": 0.14049586776859505,
"acc_norm_stderr": 0.03172233426002161
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.15178571428571427,
"acc_stderr": 0.03405702838185694,
"acc_norm": 0.15178571428571427,
"acc_norm_stderr": 0.03405702838185694
},
"harness|hendrycksTest-management|5": {
"acc": 0.3786407766990291,
"acc_stderr": 0.04802694698258972,
"acc_norm": 0.3786407766990291,
"acc_norm_stderr": 0.04802694698258972
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19230769230769232,
"acc_stderr": 0.025819233256483706,
"acc_norm": 0.19230769230769232,
"acc_norm_stderr": 0.025819233256483706
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.20561941251596424,
"acc_stderr": 0.014452500456785825,
"acc_norm": 0.20561941251596424,
"acc_norm_stderr": 0.014452500456785825
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.022289638852617904,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.022289638852617904
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249588,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3006535947712418,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.3006535947712418,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24758842443729903,
"acc_stderr": 0.024513879973621967,
"acc_norm": 0.24758842443729903,
"acc_norm_stderr": 0.024513879973621967
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22530864197530864,
"acc_stderr": 0.023246202647819746,
"acc_norm": 0.22530864197530864,
"acc_norm_stderr": 0.023246202647819746
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25177304964539005,
"acc_stderr": 0.0258921511567094,
"acc_norm": 0.25177304964539005,
"acc_norm_stderr": 0.0258921511567094
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23859191655801826,
"acc_stderr": 0.010885929742002221,
"acc_norm": 0.23859191655801826,
"acc_norm_stderr": 0.010885929742002221
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4375,
"acc_stderr": 0.030134614954403924,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.030134614954403924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.21405228758169934,
"acc_stderr": 0.01659342966232903,
"acc_norm": 0.21405228758169934,
"acc_norm_stderr": 0.01659342966232903
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.041220665028782834,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.041220665028782834
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.39591836734693875,
"acc_stderr": 0.03130802899065686,
"acc_norm": 0.39591836734693875,
"acc_norm_stderr": 0.03130802899065686
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.263681592039801,
"acc_stderr": 0.03115715086935556,
"acc_norm": 0.263681592039801,
"acc_norm_stderr": 0.03115715086935556
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322674,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322674
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21084337349397592,
"acc_stderr": 0.03175554786629921,
"acc_norm": 0.21084337349397592,
"acc_norm_stderr": 0.03175554786629921
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.17543859649122806,
"acc_stderr": 0.029170885500727654,
"acc_norm": 0.17543859649122806,
"acc_norm_stderr": 0.029170885500727654
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299953,
"mc2": 0.4880571743853537,
"mc2_stderr": 0.0172850771661607
},
"harness|winogrande|5": {
"acc": 0.5019731649565904,
"acc_stderr": 0.014052376259225636
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
lmqg/qag_itquad | ---
license: cc-by-sa-4.0
pretty_name: SQuAD for question generation
language: it
multilinguality: monolingual
size_categories: 1K<n<10K
source_datasets: lmqg/qg_itquad
task_categories:
- text-generation
task_ids:
- language-modeling
tags:
- question-generation
---
# Dataset Card for "lmqg/qag_itquad"
## Dataset Description
- **Repository:** [https://github.com/asahi417/lm-question-generation](https://github.com/asahi417/lm-question-generation)
- **Paper:** [https://arxiv.org/abs/2210.03992](https://arxiv.org/abs/2210.03992)
- **Point of Contact:** [Asahi Ushio](http://asahiushio.com/)
### Dataset Summary
This is the question & answer generation dataset based on the ITQuAD.
### Supported Tasks and Leaderboards
* `question-answer-generation`: The dataset is intended for training a model for question & answer generation.
Success on this task is typically measured by achieving a high BLEU4/METEOR/ROUGE-L/BERTScore/MoverScore score (see our paper for more detail).
### Languages
Italian (it)
## Dataset Structure
An example of 'train' looks as follows.
```
{
"paragraph": ""4 Minuti" è uscito come primo singolo dell' album e ha raggiunto il terzo posto sulla Billboard Hot 100. E' stato il 37° top-ten di Madonna che ha spinto Madonna oltre Elvis Presley come l' artista con i più top-ten hit. Nel Regno Unito ha mantenuto il suo record per il più numero uno single per una artista femminile;"4 Minuti" diventando il suo tredicesimo. Al 23° Japan Gold Disc Awards, Madonna ha ricevuto il suo quinto trofeo Artista dell' anno dalla Recording Industry Association of Japan, la più importante per qualsiasi artista. Per promuovere ulteriormente l' album, Madonna ha intrapreso il Sticky & Sweet Tour, la sua prima grande avventura con Live Nation. Con un lordo di 280 milioni di dollari, è diventato il tour più incassato di un artista solista, superando il precedente record di Madonna stabilito con il Confessions Tour; è stato poi superato da The Wall Live di Roger Waters. E' stato esteso al prossimo anno, aggiungendo nuove date europee, e dopo la fine, il totale lordo totale era di 408 milioni di dollari.",
"questions": [ "Qual è il nome del primo tour con Live Nation?", "4 minuti è diventato Madonna's che numero uno nel Regno Unito?", "Quanto ha incassato Stick e Sweet Tour?", "Madonna ha superato l' artista con i più alti dieci colpi?" ],
"answers": [ "Sticky & Sweet Tour", "tredicesimo", "280 milioni di dollari,", "Elvis Presley" ],
"questions_answers": "question: Qual è il nome del primo tour con Live Nation?, answer: Sticky & Sweet Tour | question: 4 minuti è diventato Madonna's che numero uno nel Regno Unito?, answer: tredicesimo | question: Quanto ha incassato Stick e Sweet Tour?, answer: 280 milioni di dollari, | question: Madonna ha superato l' artista con i più alti dieci colpi?, answer: Elvis Presley"
}
```
The data fields are the same among all splits.
- `questions`: a `list` of `string` features.
- `answers`: a `list` of `string` features.
- `paragraph`: a `string` feature.
- `questions_answers`: a `string` feature.
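The flattened `questions_answers` string can be split back into question–answer pairs. A minimal sketch (the `parse_questions_answers` helper below is illustrative, not part of the dataset loader):

```python
def parse_questions_answers(flat: str):
    """Split the flattened 'questions_answers' field into (question, answer) pairs."""
    pairs = []
    for chunk in flat.split(" | "):
        # Each chunk looks like "question: ..., answer: ..."
        q_part, answer = chunk.split(", answer: ", 1)
        question = q_part[len("question: "):]
        pairs.append((question, answer))
    return pairs

# Example using the format shown above:
flat = ("question: Qual è il nome del primo tour con Live Nation?, answer: Sticky & Sweet Tour"
        " | question: Madonna ha superato l' artista con i più alti dieci colpi?, answer: Elvis Presley")
pairs = parse_questions_answers(flat)
```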
## Data Splits
|train|validation|test |
|----:|---------:|----:|
|16918 | 6280 | 1988|
## Citation Information
```
@inproceedings{ushio-etal-2022-generative,
title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration",
author = "Ushio, Asahi and
Alva-Manchego, Fernando and
Camacho-Collados, Jose",
booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
month = dec,
year = "2022",
address = "Abu Dhabi, U.A.E.",
publisher = "Association for Computational Linguistics",
}
``` |
AhmedSSoliman/CodeSearchNet-py | ---
license: ms-pl
dataset_info:
features:
- name: code
dtype: string
- name: docstring
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1288057395
num_examples: 457461
download_size: 547996237
dataset_size: 1288057395
---
|
irds/disks45_nocr_trec7 | ---
pretty_name: '`disks45/nocr/trec7`'
viewer: false
source_datasets: ['irds/disks45_nocr']
task_categories:
- text-retrieval
---
# Dataset Card for `disks45/nocr/trec7`
The `disks45/nocr/trec7` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/disks45#disks45/nocr/trec7).
# Data
This dataset provides:
- `queries` (i.e., topics); count=50
- `qrels`: (relevance assessments); count=80,345
- For `docs`, use [`irds/disks45_nocr`](https://huggingface.co/datasets/irds/disks45_nocr)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/disks45_nocr_trec7', 'queries')
for record in queries:
record # {'query_id': ..., 'title': ..., 'description': ..., 'narrative': ...}
qrels = load_dataset('irds/disks45_nocr_trec7', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
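Once loaded, the qrels records are typically reshaped into a per-query lookup of relevant documents for evaluation. A sketch with hypothetical records (the sample `query_id`/`doc_id` values below are made up, not taken from the dataset):

```python
# Hypothetical qrels records in the schema shown above:
# {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
qrels = [
    {"query_id": "351", "doc_id": "FBIS3-1", "relevance": 1, "iteration": "0"},
    {"query_id": "351", "doc_id": "FBIS3-2", "relevance": 0, "iteration": "0"},
    {"query_id": "352", "doc_id": "FT921-3", "relevance": 2, "iteration": "0"},
]

# Build a mapping query_id -> set of relevant doc_ids (relevance > 0),
# the shape most IR evaluation tools expect.
relevant = {}
for rec in qrels:
    if rec["relevance"] > 0:
        relevant.setdefault(rec["query_id"], set()).add(rec["doc_id"])
```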
## Citation Information
```
@misc{Voorhees1996Disks45,
title = {NIST TREC Disks 4 and 5: Retrieval Test Collections Document Set},
author = {Ellen M. Voorhees},
doi = {10.18434/t47g6m},
year = {1996},
publisher = {National Institute of Standards and Technology}
}
@inproceedings{Voorhees1998Trec7,
title = {Overview of the Seventh Text Retrieval Conference (TREC-7)},
author = {Ellen M. Voorhees and Donna Harman},
year = {1998},
booktitle = {TREC}
}
```
|
gguichard/wsd_myriade_synth_data_gpt4turbo_xlm | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: wn_sens
sequence: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 2400979
num_examples: 3391
download_size: 472673
dataset_size: 2400979
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "wsd_myriade_synth_data_gpt4turbo_xlm"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Kamyar-zeinalipour/Protein | ---
dataset_info:
features:
- name: Cluster ID
dtype: string
- name: Cluster Name
dtype: string
- name: Types
dtype: string
- name: Size
dtype: int64
- name: Organisms
dtype: string
- name: Length
dtype: int64
- name: Identity
dtype: float64
- name: Reference sequence
dtype: string
- name: Common taxon ID
dtype: int64
- name: Common taxon
dtype: string
- name: Organism IDs
dtype: string
- name: Cluster members
dtype: string
- name: Date of creation
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 43805803
num_examples: 52000
- name: test
num_bytes: 1693705
num_examples: 1986
download_size: 27144249
dataset_size: 45499508
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Smuzzer/Rach | ---
license: openrail
---
|
Nicky0007/cointelegraph_news_English | ---
task_categories:
- token-classification
- question-answering
language:
- en
size_categories:
- 10K<n<100K
---
# Dataset cointelegraph English
## Dataset Description
A dataset collecting article information (title, description, author, etc.) from Cointelegraph.
Approx. 10,041 rows
Page: https://cointelegraph.com/
Categories: #cryptocurrency, #Bitcoin, #Ethereum ...
|
open-llm-leaderboard/details_Kquant03__Buttercup-V2-laser | ---
pretty_name: Evaluation run of Kquant03/Buttercup-V2-laser
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Kquant03/Buttercup-V2-laser](https://huggingface.co/Kquant03/Buttercup-V2-laser)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kquant03__Buttercup-V2-laser\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-16T07:34:11.973720](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Buttercup-V2-laser/blob/main/results_2024-02-16T07-34-11.973720.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6535761549256881,\n\
\ \"acc_stderr\": 0.03205604876868876,\n \"acc_norm\": 0.6528640185317818,\n\
\ \"acc_norm_stderr\": 0.032733047429496384,\n \"mc1\": 0.5520195838433293,\n\
\ \"mc1_stderr\": 0.017408513063422917,\n \"mc2\": 0.6899750707536572,\n\
\ \"mc2_stderr\": 0.01507018824423322\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7081911262798635,\n \"acc_stderr\": 0.013284525292403511,\n\
\ \"acc_norm\": 0.7312286689419796,\n \"acc_norm_stderr\": 0.012955065963710698\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7135032861979685,\n\
\ \"acc_stderr\": 0.004512002459757956,\n \"acc_norm\": 0.8847839075881299,\n\
\ \"acc_norm_stderr\": 0.0031863002304505753\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43386243386243384,\n \"acc_stderr\": 0.025525034382474887,\n \"\
acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.025525034382474887\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356852,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356852\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8232323232323232,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.8232323232323232,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644237,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644237\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977938,\n\
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977938\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659806,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659806\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163255,\n \"\
acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163255\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.031024411740572213,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.031024411740572213\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993466,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993466\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4301675977653631,\n\
\ \"acc_stderr\": 0.01655860163604103,\n \"acc_norm\": 0.4301675977653631,\n\
\ \"acc_norm_stderr\": 0.01655860163604103\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959614,\n\
\ \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959614\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.4680573663624511,\n \"acc_stderr\": 0.012744149704869647,\n\
\ \"acc_norm\": 0.4680573663624511,\n \"acc_norm_stderr\": 0.012744149704869647\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n \"\
acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6617647058823529,\n \"acc_stderr\": 0.01913994374848704,\n \
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.01913994374848704\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5520195838433293,\n\
\ \"mc1_stderr\": 0.017408513063422917,\n \"mc2\": 0.6899750707536572,\n\
\ \"mc2_stderr\": 0.01507018824423322\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8626677190213102,\n \"acc_stderr\": 0.009673669315476049\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6808188021228203,\n \
\ \"acc_stderr\": 0.012840345676251653\n }\n}\n```"
repo_url: https://huggingface.co/Kquant03/Buttercup-V2-laser
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|arc:challenge|25_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|gsm8k|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hellaswag|10_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T07-34-11.973720.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-16T07-34-11.973720.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- '**/details_harness|winogrande|5_2024-02-16T07-34-11.973720.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-16T07-34-11.973720.parquet'
- config_name: results
data_files:
- split: 2024_02_16T07_34_11.973720
path:
- results_2024-02-16T07-34-11.973720.parquet
- split: latest
path:
- results_2024-02-16T07-34-11.973720.parquet
---
# Dataset Card for Evaluation run of Kquant03/Buttercup-V2-laser
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kquant03/Buttercup-V2-laser](https://huggingface.co/Kquant03/Buttercup-V2-laser) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kquant03__Buttercup-V2-laser",
"harness_winogrande_5",
	split="latest")
```
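All the per-task configurations listed above share one naming scheme for their `data_files` globs. A small helper can rebuild that glob for any task; this is an illustrative sketch (the function name and signature are not part of the official leaderboard tooling), shown here with the `anatomy` task from this run:

```python
def details_glob(task: str, n_shot: int, timestamp: str) -> str:
    """Build the data_files glob for one hendrycksTest task (illustrative helper)."""
    return f"**/details_harness|hendrycksTest-{task}|{n_shot}_{timestamp}.parquet"

pattern = details_glob("anatomy", 5, "2024-02-16T07-34-11.973720")
# matches the entry under harness_hendrycksTest_anatomy_5 above
```

Note that the config names use underscores and a trailing shot count (`harness_hendrycksTest_anatomy_5`), while the parquet paths use the pipe-delimited harness task name.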
## Latest results
These are the [latest results from run 2024-02-16T07:34:11.973720](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Buttercup-V2-laser/blob/main/results_2024-02-16T07-34-11.973720.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each one in its own results and "latest" splits for each eval):
```python
{
"all": {
"acc": 0.6535761549256881,
"acc_stderr": 0.03205604876868876,
"acc_norm": 0.6528640185317818,
"acc_norm_stderr": 0.032733047429496384,
"mc1": 0.5520195838433293,
"mc1_stderr": 0.017408513063422917,
"mc2": 0.6899750707536572,
"mc2_stderr": 0.01507018824423322
},
"harness|arc:challenge|25": {
"acc": 0.7081911262798635,
"acc_stderr": 0.013284525292403511,
"acc_norm": 0.7312286689419796,
"acc_norm_stderr": 0.012955065963710698
},
"harness|hellaswag|10": {
"acc": 0.7135032861979685,
"acc_stderr": 0.004512002459757956,
"acc_norm": 0.8847839075881299,
"acc_norm_stderr": 0.0031863002304505753
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.040943762699967926,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.040943762699967926
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.025525034382474887,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.025525034382474887
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356852,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356852
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8232323232323232,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.8232323232323232,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644237,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977938,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977938
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659806,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163255,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163255
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.031024411740572213,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.031024411740572213
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993466,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993466
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069356,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069356
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4301675977653631,
"acc_stderr": 0.01655860163604103,
"acc_norm": 0.4301675977653631,
"acc_norm_stderr": 0.01655860163604103
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959614,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959614
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869647,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869647
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389845,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389845
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.01913994374848704,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.01913994374848704
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5520195838433293,
"mc1_stderr": 0.017408513063422917,
"mc2": 0.6899750707536572,
"mc2_stderr": 0.01507018824423322
},
"harness|winogrande|5": {
"acc": 0.8626677190213102,
"acc_stderr": 0.009673669315476049
},
"harness|gsm8k|5": {
"acc": 0.6808188021228203,
"acc_stderr": 0.012840345676251653
}
}
```
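The per-task entries above can be aggregated programmatically. Below is a minimal sketch (not part of the original card) of computing a macro-average accuracy over the MMLU (`hendrycksTest`) tasks; the dictionary literal uses only a toy subset of the scores shown above for illustration.

```python
# Macro-average accuracy over per-task MMLU ("hendrycksTest") entries in a
# results blob shaped like the JSON above. Only a small subset of the real
# scores is embedded here to keep the sketch self-contained.
results = {
    "harness|hendrycksTest-college_physics|5": {"acc": 0.4117647058823529},
    "harness|hendrycksTest-computer_security|5": {"acc": 0.76},
    "harness|gsm8k|5": {"acc": 0.6808188021228203},
}

# Keep only the MMLU sub-tasks; other harness tasks (gsm8k, winogrande, ...)
# are reported separately on the leaderboard.
mmlu_accs = [
    v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")
]
mmlu_average = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU macro-average acc over {len(mmlu_accs)} tasks: {mmlu_average:.4f}")
```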
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
psroy/mini-platypus-guanaco-two | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1343094
num_examples: 700
download_size: 753853
dataset_size: 1343094
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
huggingartists/bring-me-the-horizon | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/bring-me-the-horizon"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.269517 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/64c7d35c8d427522574cbf7773084ee3.1000x1000x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/bring-me-the-horizon">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Bring Me The Horizon</div>
<a href="https://genius.com/artists/bring-me-the-horizon">
<div style="text-align: center; font-size: 14px;">@bring-me-the-horizon</div>
</a>
</div>
### Dataset Summary
This lyrics dataset was parsed from Genius and is designed for generating lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/bring-me-the-horizon).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/bring-me-the-horizon")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train | validation | test |
|------:|-----------:|-----:|
|   173 |          - |    - |
The 'train' split can easily be divided into 'train', 'validation' and 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/bring-me-the-horizon")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
train, validation, test = np.split(datasets['train']['text'], [int(len(datasets['train']['text'])*train_percentage), int(len(datasets['train']['text'])*(train_percentage + validation_percentage))])
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
author={Aleksey Korshuk},
year={2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
wangshasha3575/test | ---
license: bigscience-bloom-rail-1.0
---
|
ShivamChattar/Grouping | ---
license: cc0-1.0
---
|
rajendrabaskota/hc3-wiki-intro-test-tokenized | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: __index_level_0__
dtype: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 22002752
num_examples: 10433
download_size: 11900271
dataset_size: 22002752
---
# Dataset Card for "hc3-wiki-intro-test-tokenized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-machine_learning | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: fewshot_context_neg
dtype: string
splits:
- name: dev
num_bytes: 6406
num_examples: 5
- name: test
num_bytes: 582534
num_examples: 112
download_size: 90771
dataset_size: 588940
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
# Dataset Card for "mmlu-machine_learning"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PeterPanTheGenius/WISE2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 71832426.0
num_examples: 996
download_size: 71786826
dataset_size: 71832426.0
---
# Dataset Card for "WISE2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vme50/github-trending-2024 | ---
license: apache-2.0
---
|
ninjaiam/attempt_1 | ---
dataset_info:
features:
- name: text
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 1639666
num_examples: 2309
download_size: 834306
dataset_size: 1639666
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-eval-cnn_dailymail-3.0.0-34156b-59952145381 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- cnn_dailymail
eval_info:
task: summarization
model: Alred/bart-base-finetuned-summarization-cnn-ver3
metrics: ['rouge', 'accuracy', 'bleu', 'exact_match', 'f1', 'perplexity', 'recall', 'precision', 'roc_auc']
dataset_name: cnn_dailymail
dataset_config: 3.0.0
dataset_split: test
col_mapping:
text: article
target: highlights
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: Alred/bart-base-finetuned-summarization-cnn-ver3
* Dataset: cnn_dailymail
* Config: 3.0.0
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@sini raj p](https://huggingface.co/sini raj p) for evaluating this model. |
ftopal/huggingface-models | ---
dataset_info:
features:
- name: sha
dtype: 'null'
- name: last_modified
dtype: 'null'
- name: library_name
dtype: string
- name: text
dtype: string
- name: metadata
dtype: string
- name: pipeline_tag
dtype: string
- name: id
dtype: string
- name: tags
sequence: string
- name: created_at
dtype: string
- name: arxiv
sequence: string
- name: languages
sequence: string
- name: tags_str
dtype: string
- name: text_str
dtype: string
- name: text_lists
sequence: string
- name: processed_texts
sequence: string
splits:
- name: train
num_bytes: 1596226483
num_examples: 240530
download_size: 441807832
dataset_size: 1596226483
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yashm/phrases |
---
license: cc-by-sa-4.0
task_categories:
- text-generation
language:
- en
size_categories:
- 1K<n<10K
---
# Dataset Card for Dataset Name
This dataset card provides an overview of the Research Phrases Dataset, designed for training and evaluating large language models (LLMs) to generate contextually relevant phrases for various sections of research papers, particularly within the fields of biology and bioinformatics. The dataset includes structured inputs with metadata and prompts to guide the model in generating outputs tailored to the specific needs of academic writing.
### Dataset Description
The Research Phrases Dataset comprises thousands of phrases structured to assist in the generation of academic content across different sections of research papers. Each entry is designed with a conditional generation approach, incorporating metadata such as the field of study, keywords, and structured prompts. This method aims to enhance the model's ability to produce section-specific text, making it a valuable resource for automating parts of the research writing process.
## Uses
The Research Phrases Dataset is intended for direct use in training and evaluating language models geared towards academic writing assistance.
### Direct Use
It can be particularly useful in applications such as:
- **Automated Writing Tools:** Supporting the development of tools that assist researchers in drafting various sections of their papers by providing contextually relevant phrases and sentences.
- **Educational Purposes:** Aiding in the education of students and early-career researchers in the structuring and writing of academic papers by offering examples of how specific sections can be articulated.
- **Content Generation:** Facilitating the generation of draft content for research papers, abstracts, and proposals, especially in the fields of biology and bioinformatics.
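A hypothetical sketch of the metadata-conditioned prompting the card describes; the function name and prompt format below are assumptions for illustration, not part of the dataset itself.

```python
# Hypothetical sketch: building the kind of structured, metadata-conditioned
# prompt the card describes, combining field of study, keywords, and a target
# paper section. The exact format used by the dataset is an assumption here.
def build_prompt(field: str, keywords: list[str], section: str) -> str:
    return (
        f"Field: {field}\n"
        f"Keywords: {', '.join(keywords)}\n"
        f"Write a phrase suitable for the {section} section of a research paper."
    )

prompt = build_prompt("bioinformatics", ["sequence alignment", "BLAST"], "Methods")
print(prompt)
```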
|
imvladikon/bmc | ---
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- he
license:
- other
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- extended|other-reuters-corpus
task_categories:
- token-classification
task_ids:
- named-entity-recognition
train-eval-index:
- config: bmc
task: token-classification
task_id: entity_extraction
splits:
train_split: train
eval_split: validation
test_split: test
col_mapping:
tokens: tokens
ner_tags: tags
metrics:
- type: seqeval
name: seqeval
---
# Splits for the Ben-Mordecai and Elhadad Hebrew NER Corpus (BMC)
In order to evaluate performance in accordance with the original Ben-Mordecai and Elhadad (2005) work, we provide three 75%-25% random splits.
* Only the 7 entity categories viable for evaluation were kept (DATE, LOC, MONEY, ORG, PER, PERCENT, TIME) --- all MISC entities were filtered out.
* Sequence label scheme was changed from IOB to BIOES
* The dev sets are 10% taken out of the 75%
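The IOB-to-BIOES scheme change described above can be sketched as follows. This is a hypothetical reconstruction assuming IOB2-style input tags; the original conversion code is not part of this card.

```python
def iob_to_bioes(tags):
    """Convert one sentence's IOB2 tag sequence to BIOES.

    B-X starting a multi-token entity stays B-X; a single-token entity
    becomes S-X; the last token of a multi-token entity becomes E-X.
    """
    bioes = []
    for i, tag in enumerate(tags):
        if tag == "O":
            bioes.append(tag)
            continue
        prefix, ent = tag.split("-", 1)
        next_tag = tags[i + 1] if i + 1 < len(tags) else "O"
        continues = next_tag == f"I-{ent}"  # entity keeps going on next token
        if prefix == "B":
            bioes.append(f"B-{ent}" if continues else f"S-{ent}")
        else:  # prefix == "I"
            bioes.append(f"I-{ent}" if continues else f"E-{ent}")
    return bioes

print(iob_to_bioes(["B-PER", "I-PER", "O", "B-LOC"]))
# -> ['B-PER', 'E-PER', 'O', 'S-LOC']
```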
## Citation
If you use the BMC corpus, please cite the original paper as well as our paper which describes the splits:
* Ben-Mordecai and Elhadad (2005):
```console
@mastersthesis{naama,
title={Hebrew Named Entity Recognition},
author={Ben-Mordecai, Naama},
advisor={Elhadad, Michael},
year={2005},
url="https://www.cs.bgu.ac.il/~elhadad/nlpproj/naama/",
institution={Department of Computer Science, Ben-Gurion University},
school={Department of Computer Science, Ben-Gurion University},
}
```
* Bareket and Tsarfaty (2020)
```console
@misc{bareket2020neural,
title={Neural Modeling for Named Entities and Morphology (NEMO^2)},
author={Dan Bareket and Reut Tsarfaty},
year={2020},
eprint={2007.15620},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
Shumit/Heart-Failure-Text | ---
license: unknown
---
|
edbeeching/prj_gia_dataset_atari_2B_atari_fishingderby_1111 | ---
library_name: gia
tags:
- deep-reinforcement-learning
- reinforcement-learning
- gia
- multi-task
- multi-modal
- imitation-learning
- offline-reinforcement-learning
---
An imitation learning environment for the atari_fishingderby environment, containing samples from the policy atari_2B_atari_fishingderby_1111.
This environment was created as part of the Generally Intelligent Agents project gia: https://github.com/huggingface/gia
|
liuyanchen1015/MULTI_VALUE_mrpc_for_to | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 149903
num_examples: 514
- name: train
num_bytes: 318424
num_examples: 1106
- name: validation
num_bytes: 38765
num_examples: 134
download_size: 333872
dataset_size: 507092
---
# Dataset Card for "MULTI_VALUE_mrpc_for_to"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/squad_qa_rare_v5_full_recite_full_passage_random_permute_rerun_4 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 7715171.746088194
num_examples: 4345
- name: validation
num_bytes: 582950
num_examples: 300
download_size: 1699958
dataset_size: 8298121.746088194
---
# Dataset Card for "squad_qa_rare_v5_full_recite_full_passage_random_permute_rerun_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
argilla/banking77_MiniLM_embeddings | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': activate_my_card
'1': age_limit
'2': apple_pay_or_google_pay
'3': atm_support
'4': automatic_top_up
'5': balance_not_updated_after_bank_transfer
'6': balance_not_updated_after_cheque_or_cash_deposit
'7': beneficiary_not_allowed
'8': cancel_transfer
'9': card_about_to_expire
'10': card_acceptance
'11': card_arrival
'12': card_delivery_estimate
'13': card_linking
'14': card_not_working
'15': card_payment_fee_charged
'16': card_payment_not_recognised
'17': card_payment_wrong_exchange_rate
'18': card_swallowed
'19': cash_withdrawal_charge
'20': cash_withdrawal_not_recognised
'21': change_pin
'22': compromised_card
'23': contactless_not_working
'24': country_support
'25': declined_card_payment
'26': declined_cash_withdrawal
'27': declined_transfer
'28': direct_debit_payment_not_recognised
'29': disposable_card_limits
'30': edit_personal_details
'31': exchange_charge
'32': exchange_rate
'33': exchange_via_app
'34': extra_charge_on_statement
'35': failed_transfer
'36': fiat_currency_support
'37': get_disposable_virtual_card
'38': get_physical_card
'39': getting_spare_card
'40': getting_virtual_card
'41': lost_or_stolen_card
'42': lost_or_stolen_phone
'43': order_physical_card
'44': passcode_forgotten
'45': pending_card_payment
'46': pending_cash_withdrawal
'47': pending_top_up
'48': pending_transfer
'49': pin_blocked
'50': receiving_money
'51': Refund_not_showing_up
'52': request_refund
'53': reverted_card_payment?
'54': supported_cards_and_currencies
'55': terminate_account
'56': top_up_by_bank_transfer_charge
'57': top_up_by_card_charge
'58': top_up_by_cash_or_cheque
'59': top_up_failed
'60': top_up_limits
'61': top_up_reverted
'62': topping_up_by_card
'63': transaction_charged_twice
'64': transfer_fee_charged
'65': transfer_into_account
'66': transfer_not_received_by_recipient
'67': transfer_timing
'68': unable_to_verify_identity
'69': verify_my_identity
'70': verify_source_of_funds
'71': verify_top_up
'72': virtual_card_not_working
'73': visa_or_mastercard
'74': why_verify_identity
'75': wrong_amount_of_cash_received
'76': wrong_exchange_rate_for_cash_withdrawal
- name: vectors
struct:
- name: mini-lm-sentence-transformers
sequence: float64
splits:
- name: test
num_bytes: 9678090
num_examples: 3080
download_size: 8319885
dataset_size: 9678090
---
# Dataset Card for "banking77_MiniLM_embeddings"
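As a hypothetical usage sketch for the `vectors` field defined in the schema above: the toy 3-dimensional vectors below stand in for the stored MiniLM sentence embeddings, and only the field names follow the card's schema.

```python
import math

# Cosine similarity between two rows' "mini-lm-sentence-transformers" vectors.
# The toy vectors are placeholders for the real MiniLM embeddings.
def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

row_a = {"vectors": {"mini-lm-sentence-transformers": [1.0, 0.0, 0.0]}}
row_b = {"vectors": {"mini-lm-sentence-transformers": [1.0, 1.0, 0.0]}}
sim = cosine_similarity(
    row_a["vectors"]["mini-lm-sentence-transformers"],
    row_b["vectors"]["mini-lm-sentence-transformers"],
)
print(round(sim, 4))  # -> 0.7071
```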
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NyanNyanovich/nyan_documents | ---
dataset_info:
features:
- name: url
dtype: string
- name: channel_id
dtype: string
- name: post_id
dtype: int64
- name: views
dtype: int64
- name: pub_time
dtype: int64
- name: text
dtype: string
- name: fetch_time
dtype: int64
- name: images
sequence: string
- name: links
sequence: string
- name: videos
sequence: string
- name: reply_to
dtype: string
- name: forward_from
dtype: string
- name: channel_title
dtype: string
- name: has_obscene
dtype: bool
- name: patched_text
dtype: string
- name: groups
struct:
- name: economy
dtype: string
- name: main
dtype: string
- name: tech
dtype: string
- name: issue
dtype: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 3508000056
num_examples: 1672028
download_size: 1827333867
dataset_size: 3508000056
license: cc-by-4.0
task_categories:
- text-generation
language:
- ru
pretty_name: Nyan Documents
size_categories:
- 1M<n<10M
---
# Nyan documents
Documents scraped for the [НЯН](https://t.me/nyannews) Telegram channel from March 2022 to December 2023. The dataset includes documents from 100+ different Telegram news channels.
## Usage
```bash
pip3 install datasets
```
```python
from datasets import load_dataset
for row in load_dataset("NyanNyanovich/nyan_documents", split="train", streaming=True):
print(row)
break
```
## Other datasets
* Documents (this dataset): https://huggingface.co/datasets/NyanNyanovich/nyan_documents
* Clusters: https://huggingface.co/datasets/NyanNyanovich/nyan_clusters |
HerczogC/NPL_UBA_2023 | ---
license: apache-2.0
---
|
CyberHarem/kirino_aya_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kirino_aya/桐野アヤ (THE iDOLM@STER: Cinderella Girls)
This is the dataset of kirino_aya/桐野アヤ (THE iDOLM@STER: Cinderella Girls), containing 24 images and their tags.
The core tags of this character are `black_hair, long_hair, brown_eyes, earrings, single_hair_bun, hair_bun`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 24 | 21.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kirino_aya_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 24 | 16.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kirino_aya_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 52 | 31.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kirino_aya_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 24 | 19.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kirino_aya_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 52 | 37.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kirino_aya_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kirino_aya_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, solo, jewelry, gloves, one_eye_closed, smile, breasts, card_(medium), character_name, dress, gem_(symbol) |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | jewelry | gloves | one_eye_closed | smile | breasts | card_(medium) | character_name | dress | gem_(symbol) |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:----------|:---------|:-----------------|:--------|:----------|:----------------|:-----------------|:--------|:---------------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X |
|
s-nlp/Mintaka_Graph_Features_T5-xl-ssm | ---
dataset_info:
features:
- name: question
dtype: string
- name: question_answer
dtype: string
- name: num_nodes
dtype: int64
- name: num_edges
dtype: int64
- name: density
dtype: float64
- name: cycle
dtype: int64
- name: bridge
dtype: int64
- name: katz_centrality
dtype: float64
- name: page_rank
dtype: float64
- name: avg_ssp_length
dtype: float64
- name: determ_sequence
dtype: string
- name: gap_sequence
dtype: string
- name: g2t_sequence
dtype: string
- name: determ_sequence_embedding
dtype: string
- name: gap_sequence_embedding
dtype: string
- name: g2t_sequence_embedding
dtype: string
- name: question_answer_embedding
dtype: string
- name: tfidf_vector
dtype: string
- name: correct
dtype: float64
splits:
- name: train
num_bytes: 8547767219
num_examples: 75582
- name: validation
num_bytes: 520992433
num_examples: 13439
- name: test
num_bytes: 2442628533
num_examples: 21574
download_size: 1886431078
dataset_size: 11511388185
---
# Dataset Card for "Mintaka_Graph_Features_T5-xl-ssm"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
VivendoDigital/belebele-chat-ita-sft2 | ---
license: apache-2.0
---
|
keremberke/nfl-object-detection | ---
task_categories:
- object-detection
tags:
- roboflow
- roboflow2huggingface
---
<div align="center">
<img width="640" alt="keremberke/nfl-object-detection" src="https://huggingface.co/datasets/keremberke/nfl-object-detection/resolve/main/thumbnail.jpg">
</div>
### Dataset Labels
```
['helmet', 'helmet-blurred', 'helmet-difficult', 'helmet-partial', 'helmet-sideline']
```
### Number of Images
```json
{'valid': 1989, 'train': 6963, 'test': 995}
```
### How to Use
- Install [datasets](https://pypi.org/project/datasets/):
```bash
pip install datasets
```
- Load the dataset:
```python
from datasets import load_dataset
ds = load_dataset("keremberke/nfl-object-detection", name="full")
example = ds['train'][0]
```
### Roboflow Dataset Page
[https://universe.roboflow.com/home-mxzv1/nfl-competition/dataset/1](https://universe.roboflow.com/home-mxzv1/nfl-competition/dataset/1?ref=roboflow2huggingface)
### Citation
```
@misc{ nfl-competition_dataset,
title = { NFL-competition Dataset },
type = { Open Source Dataset },
author = { home },
howpublished = { \\url{ https://universe.roboflow.com/home-mxzv1/nfl-competition } },
url = { https://universe.roboflow.com/home-mxzv1/nfl-competition },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2022 },
month = { sep },
note = { visited on 2023-01-18 },
}
```
### License
Public Domain
### Dataset Summary
This dataset was exported via roboflow.com on December 29, 2022 at 8:12 PM GMT
Roboflow is an end-to-end computer vision platform that helps you
* collaborate with your team on computer vision projects
* collect & organize images
* understand unstructured image data
* annotate, and create datasets
* export, train, and deploy computer vision models
* use active learning to improve your dataset over time
It includes 9947 images.
Helmets are annotated in COCO format.
The following pre-processing was applied to each image:
* Auto-orientation of pixel data (with EXIF-orientation stripping)
* Resize to 1280x720 (Stretch)
No image augmentation techniques were applied.
|
bobytest/dataset | ---
license: artistic-2.0
---
|
TinyPixel/tc-2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1460073719
num_examples: 632309
download_size: 661114976
dataset_size: 1460073719
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "tc-2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
khalidalt/HPLT_mt | ---
dataset_info:
features:
- name: translation
struct:
- name: ar
dtype: string
- name: en
dtype: string
splits:
- name: train
num_bytes: 3999943063
num_examples: 14645275
download_size: 2464581714
dataset_size: 3999943063
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
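Each row stores an Arabic–English pair under a nested `translation` struct, as described by the features above. A minimal sketch of unpacking such rows into parallel lists — the row below is a hypothetical example following the schema, not a real record:

```python
def unpack_pairs(rows):
    """Split rows of {'translation': {'ar': ..., 'en': ...}} into parallel lists."""
    ar, en = [], []
    for row in rows:
        pair = row["translation"]
        ar.append(pair["ar"])
        en.append(pair["en"])
    return ar, en

# Hypothetical rows following the schema (not real records):
rows = [{"translation": {"ar": "مرحبا", "en": "Hello"}}]
ar, en = unpack_pairs(rows)
print(en)  # -> ['Hello']
```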
|
danaroth/pavia | ---
license: unknown
---
# Description
The Pavia Centre and Pavia University scenes were acquired by the [ROSIS](http://www.opairs.aero/rosis_en.html) sensor during a flight campaign over Pavia, northern Italy. The number of spectral bands is 102 for Pavia Centre and 103 for Pavia University. Pavia Centre is a 1096 $\times$ 1096 pixel image and Pavia University is 610 $\times$ 610 pixels, but some of the samples in both images contain no information and have to be discarded before analysis. The geometric resolution is 1.3 meters. The ground truth for each image distinguishes 9 classes. The discarded samples appear in the figures as broad black strips.
# Characteristics
**Groundtruth classes for the Pavia centre scene and their respective samples number**
| # | Class | Samples |
|---|----------------------|---------|
| 1 | Water | 824 |
| 2 | Trees | 820 |
| 3 | Asphalt | 816 |
| 4 | Self-Blocking Bricks | 808 |
| 5 | Bitumen | 808 |
| 6 | Tiles | 1260 |
| 7 | Shadows | 476 |
| 8 | Meadows | 824 |
| 9 | Bare Soil | 820 |
**Groundtruth classes for the Pavia University scene and their respective samples number**
| # | Class | Samples |
|---|----------------------|---------|
| 1 | Asphalt | 6631 |
| 2 | Meadows | 18649 |
| 3 | Gravel | 2099 |
| 4 | Trees | 3064 |
| 5 | Painted metal sheets | 1345 |
| 6 | Bare Soil | 5029 |
| 7 | Bitumen | 1330 |
| 8 | Self-Blocking Bricks | 3682 |
| 9 | Shadows | 947 |
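For classification experiments, a scene cube (height × width × bands) is usually flattened into per-pixel spectra, and pixels without ground truth (class 0, the black strips mentioned above) are discarded. A minimal sketch with a small synthetic cube — the array shapes and the class-0 convention are common practice for these scenes, not something this card specifies:

```python
import numpy as np

# Synthetic stand-in for a scene: 10 x 10 pixels, 103 bands (as in Pavia University).
rng = np.random.default_rng(0)
cube = rng.random((10, 10, 103), dtype=np.float32)
gt = rng.integers(0, 10, size=(10, 10))  # 0 = unlabeled, 1..9 = classes

# Flatten to per-pixel spectra and drop unlabeled pixels.
X = cube.reshape(-1, cube.shape[-1])
y = gt.reshape(-1)
mask = y > 0
X, y = X[mask], y[mask]

print(X.shape[1])  # -> 103
```

In the real scenes, `cube` and `gt` would come from the distributed files (commonly MATLAB `.mat` arrays loaded with `scipy.io.loadmat`).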
# Quick look
<figure>
<img src= "assets/Pavia_60.png" alt="Pavia" width="300" />
<figcaption>Sample band of Pavia Centre dataset.</figcaption>
</figure>
<figure>
<img src= "assets/Pavia_gt.png" alt="Pavia gt" width="300" />
<figcaption>Groundtruth of Pavia Centre dataset.</figcaption>
</figure>
<figure>
<img src= "assets/PaviaU_60.png" alt="PaviaU" width="300" />
<figcaption>Sample band of Pavia University dataset.</figcaption>
</figure>
<figure>
<img src= "assets/PaviaU_gt.png" alt="PaviaU gt" width="300" />
<figcaption>Groundtruth of Pavia University dataset.</figcaption>
</figure>
# Credits
Pavia scenes were provided by [Prof. Paolo Gamba](http://tlclab.unipv.it/sito_tlc/people.do?id=pgamba) from the [Telecommunications and Remote Sensing Laboratory](http://tlclab.unipv.it/), [Pavia university](http://www.unipv.eu/) (Italy).
This dataset was originally collected by Manuel Graña, Miguel-Angel Veganzones, Borja Ayerdi.
The original link for the dataset is available below:
https://www.ehu.eus/ccwintco/index.php/Hyperspectral_Remote_Sensing_Scenes |
itsankitkp/github-issues | ---
dataset_info:
features:
- name: comments_url
dtype: string
- name: timeline_url
dtype: string
- name: closed_at
dtype: string
- name: performed_via_github_app
dtype: 'null'
- name: state_reason
dtype: string
- name: node_id
dtype: string
- name: state
dtype: string
- name: assignees
list:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: draft
dtype: bool
- name: number
dtype: int64
- name: user
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: events_url
dtype: string
- name: milestone
struct:
- name: closed_at
dtype: string
- name: closed_issues
dtype: int64
- name: created_at
dtype: string
- name: creator
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: description
dtype: string
- name: due_on
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: labels_url
dtype: string
- name: node_id
dtype: string
- name: number
dtype: int64
- name: open_issues
dtype: int64
- name: state
dtype: string
- name: title
dtype: string
- name: updated_at
dtype: string
- name: url
dtype: string
- name: labels_url
dtype: string
- name: created_at
dtype: string
- name: active_lock_reason
dtype: 'null'
- name: locked
dtype: bool
- name: assignee
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: pull_request
struct:
- name: diff_url
dtype: string
- name: html_url
dtype: string
- name: merged_at
dtype: string
- name: patch_url
dtype: string
- name: url
dtype: string
- name: id
dtype: int64
- name: labels
list:
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: id
dtype: int64
- name: name
dtype: string
- name: node_id
dtype: string
- name: url
dtype: string
- name: url
dtype: string
- name: comments
sequence: string
- name: repository_url
dtype: string
- name: author_association
dtype: string
- name: body
dtype: string
- name: updated_at
dtype: string
- name: html_url
dtype: string
- name: reactions
struct:
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: confused
dtype: int64
- name: eyes
dtype: int64
- name: heart
dtype: int64
- name: hooray
dtype: int64
- name: laugh
dtype: int64
- name: rocket
dtype: int64
- name: total_count
dtype: int64
- name: url
dtype: string
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 48331359
num_examples: 10200
download_size: 13328506
dataset_size: 48331359
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-52000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 662404
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
severo/dummy_public_renamed | ---
annotations_creators:
- no-annotation
license: cc0-1.0
size_categories:
- n<1K
source_datasets:
- original
pretty_name: Digitised Books - Images identified as Embellishments. c. 1510 - c. 1900. JPG
---
# Dataset Card for severo/embellishments
Test: link to a space:
https://huggingface.co/spaces/severo/voronoi-cloth
https://severo-voronoi-cloth.hf.space
## Dataset Description
- **Homepage:** [Digitised Books - Images identified as Embellishments - Homepage](https://bl.iro.bl.uk/concern/datasets/59d1aa35-c2d7-46e5-9475-9d0cd8df721e)
- **Point of Contact:** [Sylvain Lesage](mailto:sylvain.lesage@huggingface.co)
### Dataset Summary
This small dataset contains the thumbnails of the first 100 entries of [Digitised Books - Images identified as Embellishments. c. 1510 - c. 1900. JPG](https://bl.iro.bl.uk/concern/datasets/59d1aa35-c2d7-46e5-9475-9d0cd8df721e). It has been uploaded to the Hub to reproduce the tutorial by Daniel van Strien: [Using 🤗 datasets for image search](https://danielvanstrien.xyz/metadata/deployment/huggingface/ethics/huggingface-datasets/faiss/2022/01/13/image_search.html).
## Dataset Structure
### Data Instances
A typical row contains an image thumbnail, its filename, and the year of publication of the book it was extracted from.
An example looks as follows:
```
{
'fname': '000811462_05_000205_1_The Pictorial History of England being a history of the people as well as a hi_1855.jpg',
'year': '1855',
'path': 'embellishments/1855/000811462_05_000205_1_The Pictorial History of England being a history of the people as well as a hi_1855.jpg',
'img': ...
}
```
### Data Fields
- `fname`: the image filename.
- `year`: a string with the year of publication of the book from which the image has been extracted
- `path`: local path to the image
- `img`: a thumbnail of the image with a max height and width of 224 pixels
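The `img` field caps thumbnails at 224 pixels on each side. A sketch of how such a thumbnail could be produced with Pillow — the resizing call is an assumption about the preparation step, not taken from this card, and the source image here is a blank placeholder:

```python
from PIL import Image

# Hypothetical source image; in the real dataset these are book illustrations.
img = Image.new("RGB", (1200, 800), color="white")

# Pillow's thumbnail() shrinks in place, preserving aspect ratio,
# so that neither side exceeds 224 pixels.
img.thumbnail((224, 224))
print(max(img.size))  # -> 224
```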
### Data Splits
The dataset only contains 100 rows, in a single 'train' split.
## Dataset Creation
### Curation Rationale
This dataset was chosen by Daniel van Strien for his tutorial [Using 🤗 datasets for image search](https://danielvanstrien.xyz/metadata/deployment/huggingface/ethics/huggingface-datasets/faiss/2022/01/13/image_search.html), which includes the code in Python to do it.
### Source Data
#### Initial Data Collection and Normalization
As stated on the British Library webpage:
> The images were algorithmically gathered from 49,455 digitised books, equating to 65,227 volumes (25+ million pages), published between c. 1510 - c. 1900. The books cover a wide range of subject areas including philosophy, history, poetry and literature. The images are in .JPEG format.

The associated BCP-47 code is `en`.
#### Who are the source data producers?
British Library, British Library Labs, Adrian Edwards (Curator), Neil Fitzgerald (Contributor ORCID)
### Annotations
The dataset does not contain any additional annotations.
#### Annotation process
[N/A]
#### Who are the annotators?
[N/A]
### Personal and Sensitive Information
[N/A]
## Considerations for Using the Data
### Social Impact of Dataset
[N/A]
### Discussion of Biases
[N/A]
### Other Known Limitations
This is a toy dataset that aims at:
- validating the process described in the tutorial [Using 🤗 datasets for image search](https://danielvanstrien.xyz/metadata/deployment/huggingface/ethics/huggingface-datasets/faiss/2022/01/13/image_search.html) by Daniel van Strien,
- showing the [dataset viewer](https://huggingface.co/datasets/severo/embellishments/viewer/severo--embellishments/train) on an image dataset.
## Additional Information
### Dataset Curators
The dataset was created by Sylvain Lesage at Hugging Face, to replicate the tutorial [Using 🤗 datasets for image search](https://danielvanstrien.xyz/metadata/deployment/huggingface/ethics/huggingface-datasets/faiss/2022/01/13/image_search.html) by Daniel van Strien.
### Licensing Information
CC0 1.0 Universal Public Domain
|
lucasmccabe-lmi/gpt4all_code | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 294812377.0
num_examples: 93257
download_size: 143503343
dataset_size: 294812377.0
---
# Dataset Card for "gpt4all_code"
We provide a code-related subset of the original [nomic-ai/gpt4all-j-prompt-generations](https://huggingface.co/datasets/nomic-ai/gpt4all-j-prompt-generations#dataset-card-for-gpt4all-j-prompt-generations) (v1.2-jazzy revision) dataset, comprising those records whose prompts were sourced from [pacovaldez/stackoverflow-questions](https://huggingface.co/datasets/pacovaldez/stackoverflow-questions) and which explicitly mention one of Python, Java, C++, SQL, Kotlin, PHP, Swift, MATLAB, Typescript, Scala, HTML, CSS, Rust, or Perl. Output records are responses from OpenAI's GPT-3.5 Turbo. Prompt/response pairs have been reformatted to fit the Alpaca format.
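Since the pairs are stored as `instruction`/`input`/`output` fields, rendering a record for training typically means applying an Alpaca-style prompt template. A minimal sketch using the commonly cited Alpaca template — the template text is an assumption, not quoted from this dataset, and the record below is hypothetical:

```python
ALPACA_WITH_INPUT = (
    "Below is an instruction that describes a task, paired with an input that "
    "provides further context. Write a response that appropriately completes "
    "the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Input:\n{input}\n\n### Response:\n"
)
ALPACA_NO_INPUT = (
    "Below is an instruction that describes a task. Write a response that "
    "appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Response:\n"
)

def render(record):
    """Render an {instruction, input, output} record as an Alpaca-style example."""
    if record.get("input"):
        prompt = ALPACA_WITH_INPUT.format(
            instruction=record["instruction"], input=record["input"]
        )
    else:
        prompt = ALPACA_NO_INPUT.format(instruction=record["instruction"])
    return prompt + record["output"]

example = {  # hypothetical record shaped like this dataset's rows
    "instruction": "Explain list comprehensions in Python.",
    "input": "",
    "output": "A list comprehension builds a list from an iterable in one expression.",
}
print(render(example).count("### Response:"))  # -> 1
```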
Numbers:
- **Prompts**: 93257
- **Tokens**: 87686551 using the [EleutherAI/gpt-neox-20b](https://huggingface.co/EleutherAI/gpt-neox-20b) tokenizer (counting instruction+input+output) |
RamazanTM/EngRussPretrain | ---
license: openrail
---
|
dmayhem93/toolformer-v0-postprocessed | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 79229133
num_examples: 2245
download_size: 33861921
dataset_size: 79229133
---
# Dataset Card for "toolformer-v0-postprocessed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/2M_magic_nights_SDXL_refiner_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 1296156754
num_examples: 2000000
download_size: 148088218
dataset_size: 1296156754
---
# Dataset Card for "2M_magic_nights_SDXL_refiner_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KeshavRa/YSA_Supporters_Database | ---
dataset_info:
features:
- name: questions
dtype: string
- name: answers
dtype: string
splits:
- name: train
num_bytes: 6457
num_examples: 11
download_size: 6833
dataset_size: 6457
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
g4drone/voz | ---
license: openrail
---
|
EiffL/hsc | ---
license: mit
dataset_info:
features:
- name: image
sequence:
- name: band
dtype: string
- name: array
dtype:
array2_d:
shape:
- 144
- 144
dtype: float32
- name: psf_fwhm
dtype: float32
- name: scale
dtype: float32
- name: a_g
dtype: float32
- name: a_r
dtype: float32
- name: a_i
dtype: float32
- name: a_z
dtype: float32
- name: a_y
dtype: float32
- name: g_extendedness_value
dtype: float32
- name: r_extendedness_value
dtype: float32
- name: i_extendedness_value
dtype: float32
- name: z_extendedness_value
dtype: float32
- name: y_extendedness_value
dtype: float32
- name: g_cmodel_mag
dtype: float32
- name: g_cmodel_magerr
dtype: float32
- name: r_cmodel_mag
dtype: float32
- name: r_cmodel_magerr
dtype: float32
- name: i_cmodel_mag
dtype: float32
- name: i_cmodel_magerr
dtype: float32
- name: z_cmodel_mag
dtype: float32
- name: z_cmodel_magerr
dtype: float32
- name: y_cmodel_mag
dtype: float32
- name: y_cmodel_magerr
dtype: float32
- name: g_sdssshape_psf_shape11
dtype: float32
- name: g_sdssshape_psf_shape22
dtype: float32
- name: g_sdssshape_psf_shape12
dtype: float32
- name: r_sdssshape_psf_shape11
dtype: float32
- name: r_sdssshape_psf_shape22
dtype: float32
- name: r_sdssshape_psf_shape12
dtype: float32
- name: i_sdssshape_psf_shape11
dtype: float32
- name: i_sdssshape_psf_shape22
dtype: float32
- name: i_sdssshape_psf_shape12
dtype: float32
- name: z_sdssshape_psf_shape11
dtype: float32
- name: z_sdssshape_psf_shape22
dtype: float32
- name: z_sdssshape_psf_shape12
dtype: float32
- name: y_sdssshape_psf_shape11
dtype: float32
- name: y_sdssshape_psf_shape22
dtype: float32
- name: y_sdssshape_psf_shape12
dtype: float32
- name: g_sdssshape_shape11
dtype: float32
- name: g_sdssshape_shape22
dtype: float32
- name: g_sdssshape_shape12
dtype: float32
- name: r_sdssshape_shape11
dtype: float32
- name: r_sdssshape_shape22
dtype: float32
- name: r_sdssshape_shape12
dtype: float32
- name: i_sdssshape_shape11
dtype: float32
- name: i_sdssshape_shape22
dtype: float32
- name: i_sdssshape_shape12
dtype: float32
- name: z_sdssshape_shape11
dtype: float32
- name: z_sdssshape_shape22
dtype: float32
- name: z_sdssshape_shape12
dtype: float32
- name: y_sdssshape_shape11
dtype: float32
- name: y_sdssshape_shape22
dtype: float32
- name: y_sdssshape_shape12
dtype: float32
- name: object_id
dtype: string
splits:
- name: train
num_bytes: 199401799968
num_examples: 477104
download_size: 198552341806
dataset_size: 199401799968
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|