| datasetId | card |
|---|---|
rayjhon/holland | ---
license: apache-2.0
---
|
YashRawal225/translation | ---
license: apache-2.0
---
|
sinhala-nlp/SOLD | ---
task_categories:
- text-classification
- token-classification
language:
- si
---
# SOLD - A Benchmark for Sinhala Offensive Language Identification
In this repository, we introduce the Sinhala Offensive Language Dataset **(SOLD)** and present multiple experiments on this dataset. **SOLD** is a manually annotated dataset containing 10,000 posts from Twitter, annotated as offensive or not offensive at both the sentence level and the token level. **SOLD** is the largest offensive language dataset compiled for Sinhala. We also introduce **SemiSOLD**, a larger dataset containing more than 145,000 Sinhala tweets, annotated following a semi-supervised approach.
:warning: This repository contains texts that may be offensive and harmful.
## Annotation
We use an annotation scheme split into two levels, deciding (a) the offensiveness of a tweet (sentence level) and (b) the tokens that contribute to the offence (token level).
### Sentence-level
Our sentence-level offensive language detection follows level A in OLID [(Zampieri et al., 2019)](https://aclanthology.org/N19-1144/). We asked annotators to discriminate between the following types of tweets:
* **Offensive (OFF)**: Posts containing any form of non-acceptable language (profanity) or a targeted offence, which can be veiled or direct. This includes insults, threats, and posts containing profane language or swear words.
* **Not Offensive (NOT)**: Posts that do not contain offence or profanity.
Each tweet was annotated with one of the above labels, which we used as the labels in sentence-level offensive language identification.
### Token-level
To provide a human explanation of the labelling, we collect rationales for the offensive language. Following HateXplain [(Mathew et al., 2021)](https://ojs.aaai.org/index.php/AAAI/article/view/17745), we define a rationale as a specific text segment that justifies the human annotator's decision on the sentence-level label. Therefore, we ask the annotators to highlight the particular tokens in a tweet that support their judgement about the sentence-level label (offensive, not offensive). Specifically, if a tweet is offensive, we guide the annotators to highlight the tokens that support this judgement, including non-verbal expressions such as emojis and morphemes that are used to convey the intention. We use these as the token-level offensive labels in SOLD.

## Data
SOLD is released on HuggingFace. It can be loaded into pandas dataframes using the following code.
```python
from datasets import load_dataset

# Load each split and convert it to a pandas dataframe
sold_train = load_dataset('sinhala-nlp/SOLD', split='train').to_pandas()
sold_test = load_dataset('sinhala-nlp/SOLD', split='test').to_pandas()
```
The dataset contains the following columns.
* **post_id** - Twitter ID.
* **text** - Post text.
* **tokens** - Tokenised text; each token is separated by a space.
* **rationals** - Offensive tokens: 1 if a token is offensive, 0 otherwise (see the sketch below).
* **label** - Sentence-level label, offensive or not offensive.
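A minimal sketch of pairing each token with its rationale flag, assuming `rationals` holds one 0/1 flag per whitespace-separated token (adjust the parsing if the column is serialised differently):
```python
import ast

from datasets import load_dataset

sold_train = load_dataset('sinhala-nlp/SOLD', split='train').to_pandas()
row = sold_train.iloc[0]

tokens = row['tokens'].split(' ')
# Assumption: `rationals` is a sequence of 0/1 flags aligned with the
# tokens; if it is stored as a string such as "[0, 1, 0]", parse it first.
flags = row['rationals']
if isinstance(flags, str):
    flags = ast.literal_eval(flags)

for token, flag in zip(tokens, flags):
    print(f"{token}\t{'offensive' if flag == 1 else 'not offensive'}")
```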

SemiSOLD is also released on HuggingFace and can be loaded into a pandas dataframe using the following code.
```python
from datasets import load_dataset

semi_sold = load_dataset('sinhala-nlp/SemiSOLD', split='train').to_pandas()
```
The dataset contains the following columns.
* **post_id** - Twitter ID.
* **text** - Post text.

Furthermore, it contains predicted offensiveness scores from the following classifiers trained on the SOLD training set: xlmr, xlmt, mbert, sinbert, lstm_ft, cnn_ft, lstm_cbow, cnn_cbow, lstm_sl, cnn_sl and svm.
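As a minimal sketch of how these scores could be combined into semi-supervised labels (the standard-deviation filtering mirrors the `--std` argument described under Experiments; both 0.5 cutoffs are illustrative, not the paper's values):
```python
from datasets import load_dataset

semi_sold = load_dataset('sinhala-nlp/SemiSOLD', split='train').to_pandas()

score_columns = ['xlmr', 'xlmt', 'mbert', 'sinbert', 'lstm_ft', 'cnn_ft',
                 'lstm_cbow', 'cnn_cbow', 'lstm_sl', 'cnn_sl', 'svm']
scores = semi_sold[score_columns]

# Keep only tweets the classifiers broadly agree on (low standard
# deviation across their predictions), then label by the mean score.
agreement = scores.std(axis=1) < 0.5
confident = semi_sold[agreement].copy()
confident['label'] = (scores[agreement].mean(axis=1) > 0.5).map(
    {True: 'OFF', False: 'NOT'})
```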
## Experiments
Clone the repository and install the libraries using the following command (preferably inside a conda environment):
~~~
pip install -r requirements.txt
~~~
### Sentence-level
Sentence-level transformer-based experiments can be executed using the following command.
~~~
python -m experiments.sentence_level.sinhala_deepoffense
~~~
The command takes the following arguments:
~~~
--model_type : Type of the transformer model (bert, xlmroberta, roberta, etc.).
--model_name : The exact architecture and trained weights to use. This may be a Hugging Face Transformers compatible pre-trained model, a community model, or the path to a directory containing model files.
--transfer : Whether to perform transfer learning (true or false).
--transfer_language : The initial language if transfer learning is performed (hi, en or si).
* hi - Perform transfer learning from the HASOC 2019 Hindi dataset (Modha et al., 2019).
* en - Perform transfer learning from the OffensEval English dataset (Zampieri et al., 2019).
* si - Perform transfer learning from the CCMS Sinhala dataset (Rathnayake et al., 2021).
--augment : Whether to perform semi-supervised data augmentation.
--std : Standard deviation threshold used to cut down the augmented data.
--augment_type : The type of data augmentation.
* off - Augment only the offensive instances.
* normal - Augment both offensive and non-offensive instances.
~~~
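For example, a hypothetical invocation with transfer learning and augmentation (the model name and numeric value are illustrative, not prescribed defaults):
~~~
python -m experiments.sentence_level.sinhala_deepoffense --model_type xlmroberta --model_name xlm-roberta-large --transfer true --transfer_language en --augment true --std 0.5 --augment_type off
~~~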
Sentence-level CNN- and LSTM-based experiments can be executed using the following command.
~~~
python -m experiments.sentence_level.sinhala_offensive_nn
~~~
The command takes the following arguments:
~~~
--model_type : Type of the architecture (cnn2D, lstm).
--model_name : The exact word embeddings to use. This may be a gensim model or the path to a word embedding file.
--augment : Whether to perform semi-supervised data augmentation.
--std : Standard deviation threshold used to cut down the augmented data.
--augment_type : The type of data augmentation.
* off - Augment only the offensive instances.
* normal - Augment both offensive and non-offensive instances.
~~~
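For example, a hypothetical CNN run (the embedding path is illustrative):
~~~
python -m experiments.sentence_level.sinhala_offensive_nn --model_type cnn2D --model_name embeddings/sinhala_fasttext.bin --augment false
~~~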
### Token-level
Token-level transformer-based experiments can be executed using the following command.
~~~
python -m experiments.sentence_level.sinhala_mudes
~~~
The command takes the following arguments:
~~~
--model_type : Type of the transformer model (bert, xlmroberta, roberta, etc.).
--model_name : The exact architecture and trained weights to use. This may be a Hugging Face Transformers compatible pre-trained model, a community model, or the path to a directory containing model files.
--transfer : Whether to perform transfer learning (true or false).
--transfer_language : The initial language if transfer learning is performed (hatex or tsd).
* hatex - Perform transfer learning from the HateXplain dataset (Mathew et al., 2021).
* tsd - Perform transfer learning from the TSD dataset (Pavlopoulos et al., 2021).
~~~
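For example, a hypothetical token-level run with transfer learning (the model name is illustrative):
~~~
python -m experiments.sentence_level.sinhala_mudes --model_type xlmroberta --model_name xlm-roberta-large --transfer true --transfer_language hatex
~~~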
Token-level LIME experiments can be executed using the following command.
~~~
python -m experiments.sentence_level.sinhala_lime
~~~
The command takes the following arguments:
~~~
--model_type : Type of the transformer model (bert, xlmroberta, roberta, etc.).
--model_name : The exact architecture and trained weights to use. This may be a Hugging Face Transformers compatible pre-trained model, a community model, or the path to a directory containing model files.
~~~
## Acknowledgments
We want to acknowledge Janitha Hapuarachchi, Sachith Suraweera, Chandika Udaya Kumara and Ridmi Randima, the team of volunteer annotators who gave their free time and effort to help us produce SOLD.
## Citation
If you are using the dataset or the models, please cite the following paper:
~~~
@article{ranasinghe2022sold,
  title={SOLD: Sinhala Offensive Language Dataset},
  author={Ranasinghe, Tharindu and Anuradha, Isuri and Premasiri, Damith and Silva, Kanishka and Hettiarachchi, Hansi and Uyangodage, Lasitha and Zampieri, Marcos},
  journal={arXiv preprint arXiv:2212.00851},
  year={2022}
}
~~~ |
heliosprime/twitter_dataset_1713142535 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 9407
num_examples: 27
download_size: 11541
dataset_size: 9407
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713142535"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shredder-31/Mic_QG_QusestionsData | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 9999795
num_examples: 5940
- name: dev
num_bytes: 4972591
num_examples: 2970
- name: test
num_bytes: 3330037
num_examples: 1980
download_size: 9643395
dataset_size: 18302423
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
|
Ambroz/DusanovaZgodba-2 | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 323842
num_examples: 1345
download_size: 156117
dataset_size: 323842
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_stsb_come_future | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 12294
num_examples: 54
- name: test
num_bytes: 7266
num_examples: 36
- name: train
num_bytes: 20686
num_examples: 84
download_size: 36824
dataset_size: 40246
---
# Dataset Card for "MULTI_VALUE_stsb_come_future"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/jinno_megumi_eromangasensei | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Jinno Megumi
This is the dataset of Jinno Megumi, containing 81 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 81 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 196 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 222 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 81 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 81 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 81 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 196 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 196 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 166 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 222 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 222 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
CyberHarem/chloe_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of chloe/クロエ (Fire Emblem)
This is the dataset of chloe/クロエ (Fire Emblem), containing 177 images and their tags.
The core tags of this character are `breasts, long_hair, green_eyes, braid, large_breasts, aqua_hair, bangs, earrings, bow, hair_bow, blue_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 177 | 292.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chloe_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 177 | 149.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chloe_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 440 | 327.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chloe_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 177 | 250.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chloe_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 440 | 499.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chloe_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/chloe_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, cleavage, elbow_gloves, looking_at_viewer, shoulder_armor, smile, white_gloves, simple_background, solo, upper_body, jewelry, blush, covered_navel, green_hair, white_background |
| 1 | 5 |  |  |  |  |  | 1girl, breastplate, cleavage, elbow_gloves, solo, white_gloves, covered_navel, green_hair, jewelry, looking_at_viewer, open_mouth, shoulder_armor, :d |
| 2 | 9 |  |  |  |  |  | 1girl, elbow_gloves, solo, white_gloves, breastplate, cleavage, looking_at_viewer, smile, jewelry, pegasus_knight_uniform_(fire_emblem), shoulder_armor, holding_polearm, spear, covered_navel |
| 3 | 9 |  |  |  |  |  | 1girl, cleavage, smile, solo, blush, looking_at_viewer, collarbone, necklace, upper_body, green_dress, green_hair, closed_mouth, holding, short_sleeves, simple_background, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | elbow_gloves | looking_at_viewer | shoulder_armor | smile | white_gloves | simple_background | solo | upper_body | jewelry | blush | covered_navel | green_hair | white_background | breastplate | open_mouth | :d | pegasus_knight_uniform_(fire_emblem) | holding_polearm | spear | collarbone | necklace | green_dress | closed_mouth | holding | short_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:---------------|:--------------------|:-----------------|:--------|:---------------|:--------------------|:-------|:-------------|:----------|:--------|:----------------|:-------------|:-------------------|:--------------|:-------------|:-----|:---------------------------------------|:------------------|:--------|:-------------|:-----------|:--------------|:---------------|:----------|:----------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | | X | | X | | X | | X | X | | X | X | X | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | | X | | X | | X | | | X | | | X | X | X | | | | | | |
| 3 | 9 |  |  |  |  |  | X | X | | X | | X | | X | X | X | | X | | X | X | | | | | | | X | X | X | X | X | X |
|
aintech/vdf_prefix-cache |
---
tags:
- vdf
- vector-io
- vector-dataset
- vector-embeddings
---
This is a dataset created using [vector-io](https://github.com/ai-northstar-tech/vector-io)
|
okite97/news-data | ---
annotations_creators:
- other
language:
- 'en'
language_creators:
- found
license:
- afl-3.0
multilinguality:
- monolingual
pretty_name: News Dataset
size_categories:
- 1K<n<10K
source_datasets:
- original
tags: []
task_categories:
- text-classification
task_ids:
- topic-classification
- multi-class-classification
---
# Dataset Card for news-data
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Dataset Curators](#dataset-curators)
### Dataset Summary
The News Dataset is an English-language dataset containing just over 4k unique news articles scraped from AriseTv, one of the most popular television news channels in Nigeria.
### Supported Tasks and Leaderboards
It supports news article classification into different categories.
### Languages
English
## Dataset Structure
### Data Instances
```
{'Title': 'Nigeria: APC Yet to Zone Party Positions Ahead of Convention',
 'Excerpt': 'The leadership of the All Progressives Congress (APC), has denied reports that it had zoned some party positions ahead of',
 'Category': 'politics',
 'labels': 2}
```
### Data Fields
* Title: a string containing the title of a news article
* Excerpt: a string containing a short extract from the body of the article
* Category: a string giving the category of an example (string label)
* labels: an integer giving the class of an example (integer label)
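A minimal loading sketch (the field names follow the list above; this assumes the default configuration exposes the splits listed in the next section):
```python
from datasets import load_dataset

# Load the training split and inspect one example
ds = load_dataset("okite97/news-data", split="train")
example = ds[0]
print(example["Title"], "->", example["Category"], example["labels"])
```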
### Data Splits
| Dataset Split | Number of instances in split |
| ----------- | ----------- |
| Train | 4,594 |
| Paragraph | 811 |
## Dataset Creation
### Source Data
#### Initial Data Collection and Normalization
The code for the dataset creation is at *https://github.com/chimaobi-okite/NLP-Projects-Competitions/blob/main/NewsCategorization/Data/NewsDataScraping.ipynb*. The examples were scraped from <https://www.arise.tv/>.
### Annotations
#### Annotation process
The annotation is based on the news categories on the [arisetv](https://www.arise.tv) website
#### Who are the annotators?
Journalists at arisetv
## Considerations for Using the Data
### Social Impact of Dataset
The purpose of this dataset is to help develop models that can classify news articles into categories.
This task is useful for efficiently organising large quantities of text. It should be made clear that any classifications produced by models trained on this dataset reflect the language used in the articles but are automatically generated.
### Discussion of Biases
This data is biased towards news events in Nigeria, but a model built with it can still classify news from other parts of the world, with a slight degradation in performance.
### Dataset Curators
The dataset was created by people at AriseTv and scraped by [@github-chimaobi-okite](https://github.com/chimaobi-okite/)
|
pki/SecurityGPT | ---
license: unknown
language:
- en
pretty_name: SecurityGPT
---
Dataset for cybersecurity research Q&A fine-tuning.
The initial dataset incorporates results from the source below:
https://datasetsearch.research.google.com/search?src=0&query=cybersecurity&docid=L2cvMTFuX3hudnBtZw%3D%3D&filters=WyJbXCJsaWNlbnNlX2NsYXNzXCIsW1wiY29tbWVyY2lhbFwiXV0iXQ%3D%3D&property=bGljZW5zZV9jbGFzcw%3D%3D
Training will begin once a sufficient amount of data has been gathered; as of today it will probably be based on Llama or Orca with an 8k token context at 7B or 13B parameters, to be decided later.
---
|
joey234/mmlu-security_studies-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
- name: neg_prompt
dtype: string
- name: fewshot_context_neg
dtype: string
- name: fewshot_context_ori
dtype: string
splits:
- name: dev
num_bytes: 19066
num_examples: 5
- name: test
num_bytes: 7272697
num_examples: 245
download_size: 419870
dataset_size: 7291763
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
# Dataset Card for "mmlu-security_studies-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HowMannyMore/LLAMA-FineTune-Dataset | ---
dataset_info:
features:
- name: Conversations
dtype: string
- name: Menu
dtype: string
- name: Template
dtype: string
splits:
- name: train
num_bytes: 2032348
num_examples: 1920
- name: valid
num_bytes: 506630
num_examples: 480
download_size: 222741
dataset_size: 2538978
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
---
|
PartiallyTyped/answerable_tydiqa_tokenized | ---
dataset_info:
features:
- name: language
dtype: string
- name: question
sequence: string
- name: context
sequence: string
- name: references
struct:
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
- name: id
dtype: string
- name: id
dtype: string
- name: labels
dtype: bool
splits:
- name: train
num_bytes: 30320669
num_examples: 29800
- name: validation
num_bytes: 3761508
num_examples: 3709
download_size: 17981416
dataset_size: 34082177
---
# Dataset Card for "answerable_tydiqa_tokenized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rlgoff/Blackfeet | ---
license: apache-2.0
---
|
Vijish/mozilla_mongolian4 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 48000
- name: line_id
dtype: string
- name: text
dtype: string
- name: speaker_id
dtype: int64
splits:
- name: train
num_bytes: 1098088046.75
num_examples: 2210
download_size: 959274290
dataset_size: 1098088046.75
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/laura_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of laura (Fire Emblem)
This is the dataset of laura (Fire Emblem), containing 30 images and their tags.
The core tags of this character are `brown_eyes, short_hair, black_hair, brown_hair, ahoge`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 30 | 22.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/laura_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 30 | 14.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/laura_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 39 | 21.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/laura_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 30 | 20.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/laura_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 39 | 27.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/laura_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/laura_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------|
| 0 | 30 |  |  |  |  |  | 1girl, solo, smile, dress, necklace, open_mouth, staff |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | dress | necklace | open_mouth | staff |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------|:-----------|:-------------|:--------|
| 0 | 30 |  |  |  |  |  | X | X | X | X | X | X | X |
|
adalib/torchdata-oss | ---
dataset_info:
features:
- name: code
dtype: string
splits:
- name: train
num_bytes: 93482
num_examples: 260
download_size: 32858
dataset_size: 93482
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mjalg/android-14-data | ---
license: afl-3.0
---
|
Broomva/instruct-deduped-spa-guc | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 15900746.510253008
num_examples: 69908
download_size: 7087119
dataset_size: 15900746.510253008
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.04 | ---
pretty_name: Evaluation run of perlthoughts/Chupacabra-7B-v2.04
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [perlthoughts/Chupacabra-7B-v2.04](https://huggingface.co/perlthoughts/Chupacabra-7B-v2.04)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.04\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-05T10:12:55.038964](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.04/blob/main/results_2024-01-05T10-12-55.038964.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6117535002239652,\n\
\ \"acc_stderr\": 0.03305067073551487,\n \"acc_norm\": 0.6144598700035333,\n\
\ \"acc_norm_stderr\": 0.033716352758886466,\n \"mc1\": 0.5275397796817626,\n\
\ \"mc1_stderr\": 0.01747693019071219,\n \"mc2\": 0.6775807253391397,\n\
\ \"mc2_stderr\": 0.014911725947999506\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6220136518771331,\n \"acc_stderr\": 0.0141696645203031,\n\
\ \"acc_norm\": 0.6629692832764505,\n \"acc_norm_stderr\": 0.013813476652902276\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6577375024895439,\n\
\ \"acc_stderr\": 0.004734972668299617,\n \"acc_norm\": 0.8570005974905397,\n\
\ \"acc_norm_stderr\": 0.0034935679140933006\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340355,\n\
\ \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3994708994708995,\n \"acc_stderr\": 0.025225450284067887,\n \"\
acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.025225450284067887\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5903225806451613,\n\
\ \"acc_stderr\": 0.027976054915347357,\n \"acc_norm\": 0.5903225806451613,\n\
\ \"acc_norm_stderr\": 0.027976054915347357\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885416,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885416\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7474747474747475,\n \"acc_stderr\": 0.03095405547036589,\n \"\
acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.03095405547036589\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306422,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306422\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6051282051282051,\n \"acc_stderr\": 0.0247843169421564,\n \
\ \"acc_norm\": 0.6051282051282051,\n \"acc_norm_stderr\": 0.0247843169421564\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7981651376146789,\n \"acc_stderr\": 0.017208579357787586,\n \"\
acc_norm\": 0.7981651376146789,\n \"acc_norm_stderr\": 0.017208579357787586\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7843137254901961,\n\
\ \"acc_stderr\": 0.028867431449849313,\n \"acc_norm\": 0.7843137254901961,\n\
\ \"acc_norm_stderr\": 0.028867431449849313\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159253,\n\
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159253\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.031811497470553604,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.031811497470553604\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.03880848301082393,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.03880848301082393\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094634,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094634\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729245,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729245\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281344,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281344\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8007662835249042,\n\
\ \"acc_stderr\": 0.014283378044296418,\n \"acc_norm\": 0.8007662835249042,\n\
\ \"acc_norm_stderr\": 0.014283378044296418\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.02447699407624734,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.02447699407624734\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3307262569832402,\n\
\ \"acc_stderr\": 0.01573502625896612,\n \"acc_norm\": 0.3307262569832402,\n\
\ \"acc_norm_stderr\": 0.01573502625896612\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914388992,\n\
\ \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914388992\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.025842248700902168,\n\
\ \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.025842248700902168\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4491525423728814,\n\
\ \"acc_stderr\": 0.012704030518851486,\n \"acc_norm\": 0.4491525423728814,\n\
\ \"acc_norm_stderr\": 0.012704030518851486\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.029408372932278746,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.029408372932278746\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6503267973856209,\n \"acc_stderr\": 0.019291961895066385,\n \
\ \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.019291961895066385\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.689795918367347,\n \"acc_stderr\": 0.029613459872484378,\n\
\ \"acc_norm\": 0.689795918367347,\n \"acc_norm_stderr\": 0.029613459872484378\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6368159203980099,\n\
\ \"acc_stderr\": 0.034005985055990146,\n \"acc_norm\": 0.6368159203980099,\n\
\ \"acc_norm_stderr\": 0.034005985055990146\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.02753912288906145,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.02753912288906145\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5275397796817626,\n\
\ \"mc1_stderr\": 0.01747693019071219,\n \"mc2\": 0.6775807253391397,\n\
\ \"mc2_stderr\": 0.014911725947999506\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7892659826361483,\n \"acc_stderr\": 0.01146204641971069\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.514783927217589,\n \
\ \"acc_stderr\": 0.0137664630507876\n }\n}\n```"
repo_url: https://huggingface.co/perlthoughts/Chupacabra-7B-v2.04
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|arc:challenge|25_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|gsm8k|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hellaswag|10_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T10-12-55.038964.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T10-12-55.038964.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- '**/details_harness|winogrande|5_2024-01-05T10-12-55.038964.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-05T10-12-55.038964.parquet'
- config_name: results
data_files:
- split: 2024_01_05T10_12_55.038964
path:
- results_2024-01-05T10-12-55.038964.parquet
- split: latest
path:
- results_2024-01-05T10-12-55.038964.parquet
---
# Dataset Card for Evaluation run of perlthoughts/Chupacabra-7B-v2.04
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [perlthoughts/Chupacabra-7B-v2.04](https://huggingface.co/perlthoughts/Chupacabra-7B-v2.04) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.04",
"harness_winogrande_5",
split="train")
```
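The aggregated metrics described above live in the "results" configuration. As a minimal sketch (not part of the original card), they can be loaded the same way, using the `results` config and the `latest` split shown in this card's config list:
```python
from datasets import load_dataset

# Load the aggregated "results" configuration; the "latest" split points
# to the most recent evaluation run, as described above.
results = load_dataset(
    "open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.04",
    "results",
    split="latest",
)
print(results)
```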
## Latest results
These are the [latest results from run 2024-01-05T10:12:55.038964](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.04/blob/main/results_2024-01-05T10-12-55.038964.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6117535002239652,
"acc_stderr": 0.03305067073551487,
"acc_norm": 0.6144598700035333,
"acc_norm_stderr": 0.033716352758886466,
"mc1": 0.5275397796817626,
"mc1_stderr": 0.01747693019071219,
"mc2": 0.6775807253391397,
"mc2_stderr": 0.014911725947999506
},
"harness|arc:challenge|25": {
"acc": 0.6220136518771331,
"acc_stderr": 0.0141696645203031,
"acc_norm": 0.6629692832764505,
"acc_norm_stderr": 0.013813476652902276
},
"harness|hellaswag|10": {
"acc": 0.6577375024895439,
"acc_stderr": 0.004734972668299617,
"acc_norm": 0.8570005974905397,
"acc_norm_stderr": 0.0034935679140933006
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.025225450284067887,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.025225450284067887
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5903225806451613,
"acc_stderr": 0.027976054915347357,
"acc_norm": 0.5903225806451613,
"acc_norm_stderr": 0.027976054915347357
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885416,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885416
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.03095405547036589,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.03095405547036589
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306422,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306422
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6051282051282051,
"acc_stderr": 0.0247843169421564,
"acc_norm": 0.6051282051282051,
"acc_norm_stderr": 0.0247843169421564
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911499,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911499
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7981651376146789,
"acc_stderr": 0.017208579357787586,
"acc_norm": 0.7981651376146789,
"acc_norm_stderr": 0.017208579357787586
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849313,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849313
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159253,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159253
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.031811497470553604,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.031811497470553604
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.03880848301082393,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.03880848301082393
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094634,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094634
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.044986763205729245,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.044986763205729245
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281344,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281344
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8007662835249042,
"acc_stderr": 0.014283378044296418,
"acc_norm": 0.8007662835249042,
"acc_norm_stderr": 0.014283378044296418
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.02447699407624734,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.02447699407624734
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3307262569832402,
"acc_stderr": 0.01573502625896612,
"acc_norm": 0.3307262569832402,
"acc_norm_stderr": 0.01573502625896612
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.026415601914388992,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.026415601914388992
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.025842248700902168,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.025842248700902168
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4491525423728814,
"acc_stderr": 0.012704030518851486,
"acc_norm": 0.4491525423728814,
"acc_norm_stderr": 0.012704030518851486
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.625,
"acc_stderr": 0.029408372932278746,
"acc_norm": 0.625,
"acc_norm_stderr": 0.029408372932278746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.019291961895066385,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.019291961895066385
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.689795918367347,
"acc_stderr": 0.029613459872484378,
"acc_norm": 0.689795918367347,
"acc_norm_stderr": 0.029613459872484378
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6368159203980099,
"acc_stderr": 0.034005985055990146,
"acc_norm": 0.6368159203980099,
"acc_norm_stderr": 0.034005985055990146
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.02753912288906145,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.02753912288906145
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5275397796817626,
"mc1_stderr": 0.01747693019071219,
"mc2": 0.6775807253391397,
"mc2_stderr": 0.014911725947999506
},
"harness|winogrande|5": {
"acc": 0.7892659826361483,
"acc_stderr": 0.01146204641971069
},
"harness|gsm8k|5": {
"acc": 0.514783927217589,
"acc_stderr": 0.0137664630507876
}
}
```
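As a hedged sketch (not part of the original card), the linked results file can also be fetched directly with `huggingface_hub.hf_hub_download`; the filename is taken from the link above, and the exact top-level layout may wrap the excerpt shown here under a `results` key:
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results JSON referenced above from this dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.04",
    filename="results_2024-01-05T10-12-55.038964.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# The aggregate metrics may sit at the top level (as excerpted above) or
# under a "results" key, depending on the file layout.
print(data.get("results", data).get("all"))
```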
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Crosstyan/BPDataset | ---
license: openrail
tags:
- not-for-all-audiences
size_categories:
- 1K<n<10K
---
For the sake of full disclosure, I am publishing the dataset I used to train [Crosstyan/BPModel](https://huggingface.co/Crosstyan/BPModel).
It contains NSFW content. Watch with your parents if you don't feel comfortable with that. |
jemale/test | ---
license: mit
---
|
open-llm-leaderboard/details_brucethemoose__CapyTessBorosYi-34B-200K-DARE-Ties | ---
pretty_name: Evaluation run of brucethemoose/CapyTessBorosYi-34B-200K-DARE-Ties
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [brucethemoose/CapyTessBorosYi-34B-200K-DARE-Ties](https://huggingface.co/brucethemoose/CapyTessBorosYi-34B-200K-DARE-Ties)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_brucethemoose__CapyTessBorosYi-34B-200K-DARE-Ties\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-05T03:16:54.690977](https://huggingface.co/datasets/open-llm-leaderboard/details_brucethemoose__CapyTessBorosYi-34B-200K-DARE-Ties/blob/main/results_2023-12-05T03-16-54.690977.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7567711901753588,\n\
\ \"acc_stderr\": 0.028382267920122734,\n \"acc_norm\": 0.7615616815437645,\n\
\ \"acc_norm_stderr\": 0.028914131489708655,\n \"mc1\": 0.40514075887392903,\n\
\ \"mc1_stderr\": 0.017185611727753368,\n \"mc2\": 0.5583921075323958,\n\
\ \"mc2_stderr\": 0.015750345067611658\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6203071672354948,\n \"acc_stderr\": 0.014182119866974872,\n\
\ \"acc_norm\": 0.6493174061433447,\n \"acc_norm_stderr\": 0.013944635930726097\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6693885680143398,\n\
\ \"acc_stderr\": 0.004694718918225748,\n \"acc_norm\": 0.8591913961362279,\n\
\ \"acc_norm_stderr\": 0.0034711315448920457\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.03785714465066653,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.03785714465066653\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.9078947368421053,\n \"acc_stderr\": 0.02353268597044349,\n\
\ \"acc_norm\": 0.9078947368421053,\n \"acc_norm_stderr\": 0.02353268597044349\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n\
\ \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8301886792452831,\n \"acc_stderr\": 0.02310839379984132,\n\
\ \"acc_norm\": 0.8301886792452831,\n \"acc_norm_stderr\": 0.02310839379984132\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.026280550932848076,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.026280550932848076\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n\
\ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n\
\ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.049512182523962604,\n\
\ \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.049512182523962604\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n\
\ \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7829787234042553,\n \"acc_stderr\": 0.026947483121496224,\n\
\ \"acc_norm\": 0.7829787234042553,\n \"acc_norm_stderr\": 0.026947483121496224\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6052631578947368,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.6052631578947368,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7517241379310344,\n \"acc_stderr\": 0.03600105692727771,\n\
\ \"acc_norm\": 0.7517241379310344,\n \"acc_norm_stderr\": 0.03600105692727771\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.6878306878306878,\n \"acc_stderr\": 0.023865206836972592,\n \"\
acc_norm\": 0.6878306878306878,\n \"acc_norm_stderr\": 0.023865206836972592\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5396825396825397,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.5396825396825397,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.896774193548387,\n\
\ \"acc_stderr\": 0.01730838128103453,\n \"acc_norm\": 0.896774193548387,\n\
\ \"acc_norm_stderr\": 0.01730838128103453\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6502463054187192,\n \"acc_stderr\": 0.03355400904969566,\n\
\ \"acc_norm\": 0.6502463054187192,\n \"acc_norm_stderr\": 0.03355400904969566\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.8,\n \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\"\
: 0.8,\n \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.027045948825865394,\n\
\ \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.027045948825865394\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9343434343434344,\n \"acc_stderr\": 0.01764652667723332,\n \"\
acc_norm\": 0.9343434343434344,\n \"acc_norm_stderr\": 0.01764652667723332\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9740932642487047,\n \"acc_stderr\": 0.01146452335695318,\n\
\ \"acc_norm\": 0.9740932642487047,\n \"acc_norm_stderr\": 0.01146452335695318\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8076923076923077,\n \"acc_stderr\": 0.019982347208637303,\n\
\ \"acc_norm\": 0.8076923076923077,\n \"acc_norm_stderr\": 0.019982347208637303\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.40370370370370373,\n \"acc_stderr\": 0.029914812342227627,\n \
\ \"acc_norm\": 0.40370370370370373,\n \"acc_norm_stderr\": 0.029914812342227627\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8571428571428571,\n \"acc_stderr\": 0.02273020811930654,\n \
\ \"acc_norm\": 0.8571428571428571,\n \"acc_norm_stderr\": 0.02273020811930654\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5033112582781457,\n \"acc_stderr\": 0.04082393379449654,\n \"\
acc_norm\": 0.5033112582781457,\n \"acc_norm_stderr\": 0.04082393379449654\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9229357798165138,\n \"acc_stderr\": 0.011434381698911096,\n \"\
acc_norm\": 0.9229357798165138,\n \"acc_norm_stderr\": 0.011434381698911096\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6296296296296297,\n \"acc_stderr\": 0.03293377139415191,\n \"\
acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.03293377139415191\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658928,\n \"\
acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658928\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9029535864978903,\n \"acc_stderr\": 0.019269323025640266,\n \
\ \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.019269323025640266\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n\
\ \"acc_stderr\": 0.02693611191280227,\n \"acc_norm\": 0.7982062780269058,\n\
\ \"acc_norm_stderr\": 0.02693611191280227\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n\
\ \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8677685950413223,\n \"acc_stderr\": 0.030922788320445784,\n \"\
acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.030922788320445784\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.03038159675665168,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.03038159675665168\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8588957055214724,\n \"acc_stderr\": 0.027351605518389752,\n\
\ \"acc_norm\": 0.8588957055214724,\n \"acc_norm_stderr\": 0.027351605518389752\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n\
\ \"acc_stderr\": 0.04616143075028546,\n \"acc_norm\": 0.6160714285714286,\n\
\ \"acc_norm_stderr\": 0.04616143075028546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9358974358974359,\n\
\ \"acc_stderr\": 0.016046261631673137,\n \"acc_norm\": 0.9358974358974359,\n\
\ \"acc_norm_stderr\": 0.016046261631673137\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9106002554278416,\n\
\ \"acc_stderr\": 0.010203017847688303,\n \"acc_norm\": 0.9106002554278416,\n\
\ \"acc_norm_stderr\": 0.010203017847688303\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8179190751445087,\n \"acc_stderr\": 0.020776761102512992,\n\
\ \"acc_norm\": 0.8179190751445087,\n \"acc_norm_stderr\": 0.020776761102512992\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7094972067039106,\n\
\ \"acc_stderr\": 0.015183844307206165,\n \"acc_norm\": 0.7094972067039106,\n\
\ \"acc_norm_stderr\": 0.015183844307206165\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8366013071895425,\n \"acc_stderr\": 0.021170623011213505,\n\
\ \"acc_norm\": 0.8366013071895425,\n \"acc_norm_stderr\": 0.021170623011213505\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8006430868167203,\n\
\ \"acc_stderr\": 0.022691033780549656,\n \"acc_norm\": 0.8006430868167203,\n\
\ \"acc_norm_stderr\": 0.022691033780549656\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8827160493827161,\n \"acc_stderr\": 0.017903112615281123,\n\
\ \"acc_norm\": 0.8827160493827161,\n \"acc_norm_stderr\": 0.017903112615281123\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6453900709219859,\n \"acc_stderr\": 0.02853865002887863,\n \
\ \"acc_norm\": 0.6453900709219859,\n \"acc_norm_stderr\": 0.02853865002887863\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5873533246414603,\n\
\ \"acc_stderr\": 0.012573836633799022,\n \"acc_norm\": 0.5873533246414603,\n\
\ \"acc_norm_stderr\": 0.012573836633799022\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.02388688192244033,\n\
\ \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.02388688192244033\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8218954248366013,\n \"acc_stderr\": 0.015478369653108566,\n \
\ \"acc_norm\": 0.8218954248366013,\n \"acc_norm_stderr\": 0.015478369653108566\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n\
\ \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n\
\ \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8530612244897959,\n \"acc_stderr\": 0.022665400417217638,\n\
\ \"acc_norm\": 0.8530612244897959,\n \"acc_norm_stderr\": 0.022665400417217638\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9104477611940298,\n\
\ \"acc_stderr\": 0.020190670535027908,\n \"acc_norm\": 0.9104477611940298,\n\
\ \"acc_norm_stderr\": 0.020190670535027908\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.026640582539133196,\n\
\ \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.026640582539133196\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40514075887392903,\n\
\ \"mc1_stderr\": 0.017185611727753368,\n \"mc2\": 0.5583921075323958,\n\
\ \"mc2_stderr\": 0.015750345067611658\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8303078137332282,\n \"acc_stderr\": 0.010549542647363698\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6194086429112965,\n \
\ \"acc_stderr\": 0.013373971277729817\n }\n}\n```"
repo_url: https://huggingface.co/brucethemoose/CapyTessBorosYi-34B-200K-DARE-Ties
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|arc:challenge|25_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|gsm8k|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hellaswag|10_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-05T03-16-54.690977.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-05T03-16-54.690977.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- '**/details_harness|winogrande|5_2023-12-05T03-16-54.690977.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-05T03-16-54.690977.parquet'
- config_name: results
data_files:
- split: 2023_12_05T03_16_54.690977
path:
- results_2023-12-05T03-16-54.690977.parquet
- split: latest
path:
- results_2023-12-05T03-16-54.690977.parquet
---
# Dataset Card for Evaluation run of brucethemoose/CapyTessBorosYi-34B-200K-DARE-Ties
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/brucethemoose/CapyTessBorosYi-34B-200K-DARE-Ties
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [brucethemoose/CapyTessBorosYi-34B-200K-DARE-Ties](https://huggingface.co/brucethemoose/CapyTessBorosYi-34B-200K-DARE-Ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_brucethemoose__CapyTessBorosYi-34B-200K-DARE-Ties",
"harness_winogrande_5",
split="train")
```
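To retrieve the aggregated metrics instead, a minimal sketch along the same lines (assuming the `results` configuration and the `latest` split declared in the YAML header above) is:
```python
from datasets import load_dataset

# Aggregated metrics from the most recent run; "latest" always points to it.
results = load_dataset("open-llm-leaderboard/details_brucethemoose__CapyTessBorosYi-34B-200K-DARE-Ties",
	"results",
	split="latest")
```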
## Latest results
These are the [latest results from run 2023-12-05T03:16:54.690977](https://huggingface.co/datasets/open-llm-leaderboard/details_brucethemoose__CapyTessBorosYi-34B-200K-DARE-Ties/blob/main/results_2023-12-05T03-16-54.690977.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7567711901753588,
"acc_stderr": 0.028382267920122734,
"acc_norm": 0.7615616815437645,
"acc_norm_stderr": 0.028914131489708655,
"mc1": 0.40514075887392903,
"mc1_stderr": 0.017185611727753368,
"mc2": 0.5583921075323958,
"mc2_stderr": 0.015750345067611658
},
"harness|arc:challenge|25": {
"acc": 0.6203071672354948,
"acc_stderr": 0.014182119866974872,
"acc_norm": 0.6493174061433447,
"acc_norm_stderr": 0.013944635930726097
},
"harness|hellaswag|10": {
"acc": 0.6693885680143398,
"acc_stderr": 0.004694718918225748,
"acc_norm": 0.8591913961362279,
"acc_norm_stderr": 0.0034711315448920457
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.03785714465066653,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.03785714465066653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.9078947368421053,
"acc_stderr": 0.02353268597044349,
"acc_norm": 0.9078947368421053,
"acc_norm_stderr": 0.02353268597044349
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8301886792452831,
"acc_stderr": 0.02310839379984132,
"acc_norm": 0.8301886792452831,
"acc_norm_stderr": 0.02310839379984132
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.026280550932848076,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.026280550932848076
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.049512182523962604,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.049512182523962604
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7829787234042553,
"acc_stderr": 0.026947483121496224,
"acc_norm": 0.7829787234042553,
"acc_norm_stderr": 0.026947483121496224
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7517241379310344,
"acc_stderr": 0.03600105692727771,
"acc_norm": 0.7517241379310344,
"acc_norm_stderr": 0.03600105692727771
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6878306878306878,
"acc_stderr": 0.023865206836972592,
"acc_norm": 0.6878306878306878,
"acc_norm_stderr": 0.023865206836972592
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5396825396825397,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.5396825396825397,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.896774193548387,
"acc_stderr": 0.01730838128103453,
"acc_norm": 0.896774193548387,
"acc_norm_stderr": 0.01730838128103453
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6502463054187192,
"acc_stderr": 0.03355400904969566,
"acc_norm": 0.6502463054187192,
"acc_norm_stderr": 0.03355400904969566
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.8,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.8,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8606060606060606,
"acc_stderr": 0.027045948825865394,
"acc_norm": 0.8606060606060606,
"acc_norm_stderr": 0.027045948825865394
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9343434343434344,
"acc_stderr": 0.01764652667723332,
"acc_norm": 0.9343434343434344,
"acc_norm_stderr": 0.01764652667723332
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9740932642487047,
"acc_stderr": 0.01146452335695318,
"acc_norm": 0.9740932642487047,
"acc_norm_stderr": 0.01146452335695318
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.019982347208637303,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.019982347208637303
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.40370370370370373,
"acc_stderr": 0.029914812342227627,
"acc_norm": 0.40370370370370373,
"acc_norm_stderr": 0.029914812342227627
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8571428571428571,
"acc_stderr": 0.02273020811930654,
"acc_norm": 0.8571428571428571,
"acc_norm_stderr": 0.02273020811930654
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5033112582781457,
"acc_stderr": 0.04082393379449654,
"acc_norm": 0.5033112582781457,
"acc_norm_stderr": 0.04082393379449654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9229357798165138,
"acc_stderr": 0.011434381698911096,
"acc_norm": 0.9229357798165138,
"acc_norm_stderr": 0.011434381698911096
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.03293377139415191,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.03293377139415191
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658928,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658928
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.019269323025640266,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.019269323025640266
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.02693611191280227,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.02693611191280227
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.030922788320445784,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.030922788320445784
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.03038159675665168,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.03038159675665168
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8588957055214724,
"acc_stderr": 0.027351605518389752,
"acc_norm": 0.8588957055214724,
"acc_norm_stderr": 0.027351605518389752
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.04616143075028546,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.04616143075028546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9358974358974359,
"acc_stderr": 0.016046261631673137,
"acc_norm": 0.9358974358974359,
"acc_norm_stderr": 0.016046261631673137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9106002554278416,
"acc_stderr": 0.010203017847688303,
"acc_norm": 0.9106002554278416,
"acc_norm_stderr": 0.010203017847688303
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8179190751445087,
"acc_stderr": 0.020776761102512992,
"acc_norm": 0.8179190751445087,
"acc_norm_stderr": 0.020776761102512992
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7094972067039106,
"acc_stderr": 0.015183844307206165,
"acc_norm": 0.7094972067039106,
"acc_norm_stderr": 0.015183844307206165
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8366013071895425,
"acc_stderr": 0.021170623011213505,
"acc_norm": 0.8366013071895425,
"acc_norm_stderr": 0.021170623011213505
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8006430868167203,
"acc_stderr": 0.022691033780549656,
"acc_norm": 0.8006430868167203,
"acc_norm_stderr": 0.022691033780549656
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8827160493827161,
"acc_stderr": 0.017903112615281123,
"acc_norm": 0.8827160493827161,
"acc_norm_stderr": 0.017903112615281123
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6453900709219859,
"acc_stderr": 0.02853865002887863,
"acc_norm": 0.6453900709219859,
"acc_norm_stderr": 0.02853865002887863
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5873533246414603,
"acc_stderr": 0.012573836633799022,
"acc_norm": 0.5873533246414603,
"acc_norm_stderr": 0.012573836633799022
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.02388688192244033,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.02388688192244033
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8218954248366013,
"acc_stderr": 0.015478369653108566,
"acc_norm": 0.8218954248366013,
"acc_norm_stderr": 0.015478369653108566
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8530612244897959,
"acc_stderr": 0.022665400417217638,
"acc_norm": 0.8530612244897959,
"acc_norm_stderr": 0.022665400417217638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9104477611940298,
"acc_stderr": 0.020190670535027908,
"acc_norm": 0.9104477611940298,
"acc_norm_stderr": 0.020190670535027908
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.026640582539133196,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.026640582539133196
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40514075887392903,
"mc1_stderr": 0.017185611727753368,
"mc2": 0.5583921075323958,
"mc2_stderr": 0.015750345067611658
},
"harness|winogrande|5": {
"acc": 0.8303078137332282,
"acc_stderr": 0.010549542647363698
},
"harness|gsm8k|5": {
"acc": 0.6194086429112965,
"acc_stderr": 0.013373971277729817
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
FaalSa/f3 | ---
dataset_info:
features:
- name: start
dtype: timestamp[s]
- name: target
sequence: float32
- name: item_id
dtype: string
- name: feat_static_cat
sequence: uint64
splits:
- name: train
num_bytes: 79710
num_examples: 1
- name: validation
num_bytes: 80190
num_examples: 1
- name: test
num_bytes: 80670
num_examples: 1
download_size: 38187
dataset_size: 240570
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
chenyanjin/legedo-github-issues | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: labels
list:
- name: id
dtype: int64
- name: node_id
dtype: string
- name: url
dtype: string
- name: name
dtype: string
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: assignees
list:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: milestone
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: labels_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: description
dtype: string
- name: creator
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: open_issues
dtype: int64
- name: closed_issues
dtype: int64
- name: state
dtype: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: due_on
dtype: timestamp[s]
- name: closed_at
dtype: 'null'
- name: comments
sequence: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: closed_at
dtype: timestamp[s]
- name: author_association
dtype: string
- name: active_lock_reason
dtype: string
- name: body
dtype: string
- name: reactions
struct:
- name: url
dtype: string
- name: total_count
dtype: int64
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: laugh
dtype: int64
- name: hooray
dtype: int64
- name: confused
dtype: int64
- name: heart
dtype: int64
- name: rocket
dtype: int64
- name: eyes
dtype: int64
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: 'null'
- name: state_reason
dtype: string
- name: draft
dtype: bool
- name: pull_request
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: diff_url
dtype: string
- name: patch_url
dtype: string
- name: merged_at
dtype: timestamp[s]
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 8768991
num_examples: 2819
download_size: 2134021
dataset_size: 8768991
---
# Dataset Card for "legedo-github-issues"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ruhaan04/mini-platypus | ---
dataset_info:
features:
- name: Generated Question
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 11833
num_examples: 25
download_size: 10190
dataset_size: 11833
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AlfredHo0830/aivrtesting | ---
license: apache-2.0
---
|
jameskrw/balanced_scikit_adult_census_income | ---
license: apache-2.0
task_categories:
- text-classification
language:
- en
tags:
- finance
size_categories:
- 10K<n<100K
---
A balanced version of scikit_adult_census_income. |
MFRocket/MFRPC | ---
task_categories:
- conditional-text-generation
- paraphrase
- gpt-3
- crowdsourced
---
# MF Rocket Paraphrase Corpus (MFRPC) - A State of the Art Paraphrasing Solution
## Dataset Description
MF Rocket Paraphrase Corpus (MFRPC) is a corpus consisting of 10,000 sentence pairs. Each sentence pair contains a source sentence and the paraphrased version of the source sentence. The source sentences were created manually and are intended to represent typical sentences found in online articles. They are limited to general topics and are not restricted to a specific domain. The paraphrased sentences were created partly using GPT-3 and partly manually. In this way, we hope to investigate the performance of GPT-3 in a typical real-world setting and improve the quality of the paraphrased sentences through manual corrections.
By fine-tuning a Pegasus model with this data, we create a paraphraser that performs very well. The results are indistinguishable from human-paraphrased sentences in a blind test.
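As an illustration of the kind of paraphraser this enables, a minimal inference sketch with the `transformers` library is shown below; the checkpoint name is a placeholder, not our released model, and stands in for a Pegasus model fine-tuned on MFRPC:
```python
from transformers import PegasusForConditionalGeneration, PegasusTokenizer

# Placeholder checkpoint; substitute a Pegasus model fine-tuned on MFRPC.
model_name = "google/pegasus-xsum"
tokenizer = PegasusTokenizer.from_pretrained(model_name)
model = PegasusForConditionalGeneration.from_pretrained(model_name)

source = "If you are unsure about what to do next, seek advice from a close friend."
batch = tokenizer([source], truncation=True, padding="longest", return_tensors="pt")
generated = model.generate(**batch, num_beams=5, max_length=60)
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```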
We are currently working on a dataset with complete paragraphs or articles.
For more information, use our contact form at https://mf-rocket.de.
### Languages
The BCP-47 code for the dataset's language is en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "To overcome these difficulties, you must select an activity or goal that you are enthusiastic about [...]",
"target": "To overcome these challenges, you need to find an activity or goal that you are passionate about and[...]"
},
{
"text": "If you are unsure about what to do next, seek advice from a close friend or family member you can tr[...]",
"target": "If you are feeling lost, ask a trusted friend or family member for their opinion about what you shou[...]"
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"target": "Value(dtype='string', id=None)"
}
```
### Dataset Splits
This dataset is split into a train and a validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 8000 |
| valid | 2000 |
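A minimal sketch for loading these splits (assuming the repo id `MFRocket/MFRPC` shown above resolves with the `datasets` library):
```python
from datasets import load_dataset

# Split names follow the table above.
train = load_dataset("MFRocket/MFRPC", split="train")
valid = load_dataset("MFRocket/MFRPC", split="valid")
print(len(train), len(valid))  # expected: 8000 2000
```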
|
luismond/tm2tb | ---
license: mit
---
|
Praghxx/litlleprag2 | ---
license: openrail
---
|
moriyad/descriptive_contract_smells | ---
license: unknown
---
|
kartik727/Test_Dataset | ---
license: mit
language:
- en
tags:
- vison-language
pretty_name: NLP Project Dataset
size_categories:
- n<1K
--- |
rev0lt0s0/muttyverse | ---
license: eupl-1.1
---
|
open-llm-leaderboard/details_AbacusResearch__haLLAwa2 | ---
pretty_name: Evaluation run of AbacusResearch/haLLAwa2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AbacusResearch/haLLAwa2](https://huggingface.co/AbacusResearch/haLLAwa2) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one\
\ of the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AbacusResearch__haLLAwa2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-12T13:50:58.490257](https://huggingface.co/datasets/open-llm-leaderboard/details_AbacusResearch__haLLAwa2/blob/main/results_2024-02-12T13-50-58.490257.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6355767153439188,\n\
\ \"acc_stderr\": 0.032413752856157885,\n \"acc_norm\": 0.6387091168117495,\n\
\ \"acc_norm_stderr\": 0.03305418130027954,\n \"mc1\": 0.33047735618115054,\n\
\ \"mc1_stderr\": 0.016466769613698303,\n \"mc2\": 0.4737549402479496,\n\
\ \"mc2_stderr\": 0.015584581777910896\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6015358361774744,\n \"acc_stderr\": 0.014306946052735565,\n\
\ \"acc_norm\": 0.6331058020477816,\n \"acc_norm_stderr\": 0.014084133118104298\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6836287592113125,\n\
\ \"acc_stderr\": 0.004641092001425291,\n \"acc_norm\": 0.8450507866958773,\n\
\ \"acc_norm_stderr\": 0.003611167302959773\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.04218506215368881,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.04218506215368881\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.02845015479411864,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.02845015479411864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n\
\ \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n\
\ \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287533,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287533\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997695,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997695\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7419354838709677,\n\
\ \"acc_stderr\": 0.024892469172462836,\n \"acc_norm\": 0.7419354838709677,\n\
\ \"acc_norm_stderr\": 0.024892469172462836\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.035145285621750094,\n\
\ \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.035145285621750094\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015184,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015184\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6205128205128205,\n \"acc_stderr\": 0.024603626924097417,\n\
\ \"acc_norm\": 0.6205128205128205,\n \"acc_norm_stderr\": 0.024603626924097417\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.362962962962963,\n \"acc_stderr\": 0.029318203645206865,\n \
\ \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.029318203645206865\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059288,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059288\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.037101857261199946,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.037101857261199946\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8311926605504587,\n \"acc_stderr\": 0.01606005626853035,\n \"\
acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.01606005626853035\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601453,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601453\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728742,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728742\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8135376756066411,\n\
\ \"acc_stderr\": 0.013927751372001512,\n \"acc_norm\": 0.8135376756066411,\n\
\ \"acc_norm_stderr\": 0.013927751372001512\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.02418242749657761,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.02418242749657761\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4223463687150838,\n\
\ \"acc_stderr\": 0.016519594275297114,\n \"acc_norm\": 0.4223463687150838,\n\
\ \"acc_norm_stderr\": 0.016519594275297114\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.02495418432487991,\n\
\ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.02495418432487991\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7363344051446945,\n\
\ \"acc_stderr\": 0.02502553850053234,\n \"acc_norm\": 0.7363344051446945,\n\
\ \"acc_norm_stderr\": 0.02502553850053234\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.02532988817190092,\n\
\ \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.02532988817190092\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4602346805736636,\n\
\ \"acc_stderr\": 0.01272978538659856,\n \"acc_norm\": 0.4602346805736636,\n\
\ \"acc_norm_stderr\": 0.01272978538659856\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6470588235294118,\n \"acc_stderr\": 0.019333142020797157,\n \
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.019333142020797157\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.03878626771002361,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.03878626771002361\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33047735618115054,\n\
\ \"mc1_stderr\": 0.016466769613698303,\n \"mc2\": 0.4737549402479496,\n\
\ \"mc2_stderr\": 0.015584581777910896\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7584846093133386,\n \"acc_stderr\": 0.012028983782011875\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5208491281273692,\n \
\ \"acc_stderr\": 0.013760506094029868\n }\n}\n```"
repo_url: https://huggingface.co/AbacusResearch/haLLAwa2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|arc:challenge|25_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|arc:challenge|25_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|gsm8k|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|gsm8k|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hellaswag|10_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hellaswag|10_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T13-39-22.814188.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T13-50-58.490257.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-12T13-50-58.490257.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- '**/details_harness|winogrande|5_2024-02-12T13-39-22.814188.parquet'
- split: 2024_02_12T13_50_58.490257
path:
- '**/details_harness|winogrande|5_2024-02-12T13-50-58.490257.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-12T13-50-58.490257.parquet'
- config_name: results
data_files:
- split: 2024_02_12T13_39_22.814188
path:
- results_2024-02-12T13-39-22.814188.parquet
- split: 2024_02_12T13_50_58.490257
path:
- results_2024-02-12T13-50-58.490257.parquet
- split: latest
path:
- results_2024-02-12T13-50-58.490257.parquet
---
# Dataset Card for Evaluation run of AbacusResearch/haLLAwa2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AbacusResearch/haLLAwa2](https://huggingface.co/AbacusResearch/haLLAwa2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AbacusResearch__haLLAwa2",
"harness_winogrande_5",
split="train")
```
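The aggregated metrics can be loaded the same way from the "results" configuration defined above. A minimal sketch, assuming the "latest" split (any timestamped split name from the configuration list works as well):
```python
from datasets import load_dataset

# Load the aggregated metrics; the "latest" split always points
# to the most recent evaluation run of this model.
results = load_dataset("open-llm-leaderboard/details_AbacusResearch__haLLAwa2",
	"results",
	split="latest")
```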
## Latest results
These are the [latest results from run 2024-02-12T13:50:58.490257](https://huggingface.co/datasets/open-llm-leaderboard/details_AbacusResearch__haLLAwa2/blob/main/results_2024-02-12T13-50-58.490257.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find them in the "results" configuration and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.6355767153439188,
"acc_stderr": 0.032413752856157885,
"acc_norm": 0.6387091168117495,
"acc_norm_stderr": 0.03305418130027954,
"mc1": 0.33047735618115054,
"mc1_stderr": 0.016466769613698303,
"mc2": 0.4737549402479496,
"mc2_stderr": 0.015584581777910896
},
"harness|arc:challenge|25": {
"acc": 0.6015358361774744,
"acc_stderr": 0.014306946052735565,
"acc_norm": 0.6331058020477816,
"acc_norm_stderr": 0.014084133118104298
},
"harness|hellaswag|10": {
"acc": 0.6836287592113125,
"acc_stderr": 0.004641092001425291,
"acc_norm": 0.8450507866958773,
"acc_norm_stderr": 0.003611167302959773
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368881,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368881
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.02845015479411864,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.02845015479411864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.037738099906869334,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.037738099906869334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287533,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287533
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997695,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997695
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7419354838709677,
"acc_stderr": 0.024892469172462836,
"acc_norm": 0.7419354838709677,
"acc_norm_stderr": 0.024892469172462836
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.035145285621750094,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.035145285621750094
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015184,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6205128205128205,
"acc_stderr": 0.024603626924097417,
"acc_norm": 0.6205128205128205,
"acc_norm_stderr": 0.024603626924097417
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.029318203645206865,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.029318203645206865
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059288,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059288
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.037101857261199946,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.037101857261199946
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.01606005626853035,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.01606005626853035
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601453,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601453
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229143,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229143
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728742,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728742
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.013927751372001512,
"acc_norm": 0.8135376756066411,
"acc_norm_stderr": 0.013927751372001512
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.02418242749657761,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.02418242749657761
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4223463687150838,
"acc_stderr": 0.016519594275297114,
"acc_norm": 0.4223463687150838,
"acc_norm_stderr": 0.016519594275297114
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7363344051446945,
"acc_stderr": 0.02502553850053234,
"acc_norm": 0.7363344051446945,
"acc_norm_stderr": 0.02502553850053234
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7067901234567902,
"acc_stderr": 0.02532988817190092,
"acc_norm": 0.7067901234567902,
"acc_norm_stderr": 0.02532988817190092
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4602346805736636,
"acc_stderr": 0.01272978538659856,
"acc_norm": 0.4602346805736636,
"acc_norm_stderr": 0.01272978538659856
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.019333142020797157,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.019333142020797157
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.03878626771002361,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.03878626771002361
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.33047735618115054,
"mc1_stderr": 0.016466769613698303,
"mc2": 0.4737549402479496,
"mc2_stderr": 0.015584581777910896
},
"harness|winogrande|5": {
"acc": 0.7584846093133386,
"acc_stderr": 0.012028983782011875
},
"harness|gsm8k|5": {
"acc": 0.5208491281273692,
"acc_stderr": 0.013760506094029868
}
}
```
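For readers who want to re-aggregate these numbers, the following is a minimal sketch (not part of the original evaluation run) that computes the mean MMLU accuracy from a results file shaped like the JSON above; the `results.json` filename is an assumption for illustration.

```python
# Minimal sketch: average the per-task MMLU ("hendrycksTest") accuracies from
# a results file shaped like the JSON above. "results.json" is a placeholder.
import json

with open("results.json") as f:
    results = json.load(f)

mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"MMLU tasks: {len(mmlu)}, mean acc: {sum(mmlu) / len(mmlu):.4f}")
```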
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
AtAndDev/ShareGPT-Vicuna-v3-cleaned-unfiltered | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1211675
num_examples: 145
download_size: 0
dataset_size: 1211675
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ShareGPT-Vicuna-v3-cleaned-unfiltered"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
qazi-ali/llama_2-optimized-titles-esci-sft-test | ---
dataset_info:
features:
- name: index
dtype: int64
- name: product_title
dtype: string
- name: text
dtype: string
- name: preds
dtype: string
- name: clean_preds
dtype: string
- name: average_score
dtype: float64
- name: new_score
dtype: float64
- name: good_pred
dtype: string
- name: bad_pred
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 3059731.0
num_examples: 2321
download_size: 1697427
dataset_size: 3059731.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama_2-optimized-titles-esci-sft-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
strombergnlp/nlpcc-stance | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- zh
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- sentiment-analysis
pretty_name: NLPCC Stance
tags:
- stance-detection
---
# Dataset Card for "NLPCC 2016: Stance Detection in Chinese Microblogs"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [http://tcci.ccf.org.cn/conference/2016/pages/page05_evadata.html](http://tcci.ccf.org.cn/conference/2016/pages/page05_evadata.html)
- **Repository:**
- **Paper:** [https://link.springer.com/chapter/10.1007/978-3-319-50496-4_85](https://link.springer.com/chapter/10.1007/978-3-319-50496-4_85)
- **Point of Contact:** [Mads Kongsback](https://github.com/mkonxd)
- **Size of downloaded dataset files:**
- **Size of the generated dataset:**
- **Total amount of disk used:**
### Dataset Summary
This is a stance prediction dataset in Chinese.
The data comes from the shared task on stance detection in Chinese microblogs at NLPCC-ICCPOL 2016. It covers Task A, a mandatory supervised task for detecting stance towards five targets of interest using the provided labeled data.
Some instances have been removed from the dataset because they were unlabeled.
### Supported Tasks and Leaderboards
* Stance Detection in Chinese Microblogs
### Languages
Chinese, as spoken on the Weibo website (`bcp47:zh`)
## Dataset Structure
### Data Instances
Example instance:
```
{
'id': '0',
'target': 'IphoneSE',
'text': '3月31日,苹果iPhone SE正式开卖,然而这款小屏新机并未出现人们预想的疯抢局面。根据市场分析机构Localytics周一公布的数据,iPhone SE正式上市的这个周末,销量成绩并不算太好。',
'stance': 2
}
```
### Data Fields
* id: a `string` field with a unique id for the instance
* target: a `string` representing the target of the stance
* text: a `string` of the stance-bearing text
* stance: an `int` representing class label -- `0`: AGAINST; `1`: FAVOR; `2`: NONE.
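The following is a minimal loading sketch using the `datasets` pattern shown elsewhere in this document; the absence of a configuration name is an assumption and may need adjusting to match the repository.

```python
# Minimal sketch: load the training split and decode the integer stance label
# using the mapping documented above. Config name (if any) is assumed default.
from datasets import load_dataset

nlpcc = load_dataset("strombergnlp/nlpcc-stance", split="train")
id2label = {0: "AGAINST", 1: "FAVOR", 2: "NONE"}

example = nlpcc[0]
print(example["target"], "->", id2label[example["stance"]])
```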
### Data Splits
The training split has 2986 instances.
## Dataset Creation
### Curation Rationale
The goal was to create a dataset of microblog text annotated for stance. Six stance targets were selected and data was collected from Sina Weibo for annotation.
### Source Data
#### Initial Data Collection and Normalization
Not specified
#### Who are the source language producers?
Sina Weibo users
### Annotations
#### Annotation process
Each target-microblog pair was independently annotated by two students. If the
two annotations agreed, that label was assigned to the pair. If they disagreed,
a third student annotated the pair and the final label was decided by majority
vote.
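As an illustration only (this is not the authors' code), the adjudication logic described above can be sketched as follows:

```python
# Illustrative sketch of the annotation adjudication described above: two
# annotators label first; on disagreement a third annotates and the majority
# label wins. Returns None when a third annotation is needed or no majority.
from collections import Counter
from typing import Optional

def adjudicate(ann1: str, ann2: str, ann3: Optional[str] = None) -> Optional[str]:
    if ann1 == ann2:
        return ann1
    if ann3 is None:
        return None  # disagreement: a third annotator is assigned
    label, count = Counter([ann1, ann2, ann3]).most_common(1)[0]
    return label if count >= 2 else None

print(adjudicate("FAVOR", "FAVOR"))             # FAVOR
print(adjudicate("FAVOR", "AGAINST"))           # None (third annotation needed)
print(adjudicate("FAVOR", "AGAINST", "FAVOR"))  # FAVOR
```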
#### Who are the annotators?
Students in China
### Personal and Sensitive Information
No reflections on this are provided by the dataset authors.
## Considerations for Using the Data
### Social Impact of Dataset
The data preserves social media utterances verbatim, which precludes any exercise of the right to be forgotten, though usernames and post IDs are not explicitly included in the data.
### Discussion of Biases
There will be at least a temporal and regional bias to this data; it also represents expressions of stance on only six topics.
### Other Known Limitations
## Additional Information
### Dataset Curators
The dataset is curated by the paper's authors.
### Licensing Information
The authors distribute this data under Creative Commons attribution license, CC-BY 4.0.
### Citation Information
```
@incollection{xu2016overview,
title={Overview of nlpcc shared task 4: Stance detection in chinese microblogs},
author={Xu, Ruifeng and Zhou, Yu and Wu, Dongyin and Gui, Lin and Du, Jiachen and Xue, Yun},
booktitle={Natural language understanding and intelligent applications},
pages={907--916},
year={2016},
publisher={Springer}
}
```
### Contributions
Added by [@mkonxd](https://github.com/mkonxd), [@leondz](https://github.com/leondz)
|
AlekseyKorshuk/dalio-handwritten-complete | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: test
num_bytes: 11957
num_examples: 10
- name: train
num_bytes: 80837
num_examples: 55
- name: validation
num_bytes: 13340
num_examples: 10
download_size: 79024
dataset_size: 106134
---
# Dataset Card for "dalio-handwritten-complete"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/37dd4157 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 188
num_examples: 10
download_size: 1336
dataset_size: 188
---
# Dataset Card for "37dd4157"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-autoevaluate__squad-sample-autoevaluate__squad-sample-778ba0-17436362 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- autoevaluate/squad-sample
eval_info:
task: extractive_question_answering
model: autoevaluate/extractive-question-answering
metrics: []
dataset_name: autoevaluate/squad-sample
dataset_config: autoevaluate--squad-sample
dataset_split: test
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: autoevaluate/extractive-question-answering
* Dataset: autoevaluate/squad-sample
* Config: autoevaluate--squad-sample
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
Jonga5426/Jonga | ---
license: other
---
|
zhengr/ultrachat_200k | ---
language:
- en
license: mit
size_categories:
- 100K<n<1M
task_categories:
- conversational
- text-generation
pretty_name: UltraChat 200k
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
- split: test_sft
path: data/test_sft-*
- split: train_gen
path: data/train_gen-*
- split: test_gen
path: data/test_gen-*
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train_sft
num_bytes: 1397058554
num_examples: 207865
- name: test_sft
num_bytes: 154695659
num_examples: 23110
- name: train_gen
num_bytes: 1347396812
num_examples: 256032
- name: test_gen
num_bytes: 148276089
num_examples: 28304
download_size: 1624049723
dataset_size: 3047427114
---
# Dataset Card for UltraChat 200k
## Dataset Description
This is a heavily filtered version of the [UltraChat](https://github.com/thunlp/UltraChat) dataset and was used to train [Zephyr-7B-β](https://huggingface.co/HuggingFaceH4/zephyr-7b-beta), a state-of-the-art 7B chat model.
The original dataset consists of 1.4M dialogues generated by ChatGPT, spanning a wide range of topics. To create `UltraChat 200k`, we applied the following logic:
- Selection of a subset of data for faster supervised fine tuning.
- Truecasing of the dataset, as we observed around 5% of the data contained grammatical errors like "Hello. how are you?" instead of "Hello. How are you?"
- Removal of dialogues where the assistant replies with phrases like "I do not have emotions" or "I don't have opinions", even for fact-based prompts that don't involve either.
## Dataset Structure
The dataset has four splits, suitable for:
* Supervised fine-tuning (`sft`).
* Generation ranking (`gen`) via techniques like rejection sampling or PPO.
The number of examples per split is as follows:
| train_sft | test_sft | train_gen | test_gen |
|:-------:|:-----------:|:-----:| :-----:|
| 207865 | 23110 | 256032 | 28304 |
The dataset is stored in parquet format with each entry using the following schema:
```
{
"prompt": "Create a fully-developed protagonist who is challenged to survive within a dystopian society under the rule of a tyrant. ...",
"messages":[
{
"content": "Create a fully-developed protagonist who is challenged to survive within a dystopian society under the rule of a tyrant. ...",
"role": "user"
},
{
"content": "Name: Ava\n\n Ava was just 16 years old when the world as she knew it came crashing down. The government had collapsed, leaving behind a chaotic and lawless society. ...",
"role": "assistant"
},
{
"content": "Wow, Ava's story is so intense and inspiring! Can you provide me with more details. ...",
"role": "user"
},
{
"content": "Certainly! ....",
"role": "assistant"
},
{
"content": "That's really interesting! I would love to hear more...",
"role": "user"
        },
{
"content": "Certainly! ....",
"role": "assistant"
        }
],
"prompt_id": "d938b65dfe31f05f80eb8572964c6673eddbd68eff3db6bd234d7f1e3b86c2af"
}
```
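As a quick-start, here is a minimal loading sketch using the `datasets` library, mirroring the loading pattern used elsewhere in this document:

```python
# Minimal sketch: load the SFT training split and inspect one conversation.
from datasets import load_dataset

train_sft = load_dataset("zhengr/ultrachat_200k", split="train_sft")

example = train_sft[0]
print(example["prompt_id"])
for message in example["messages"][:2]:
    print(f"{message['role']}: {message['content'][:80]}")
```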
## Citation
If you find this dataset useful in your work, please cite the original UltraChat dataset:
```
@misc{ding2023enhancing,
title={Enhancing Chat Language Models by Scaling High-quality Instructional Conversations},
author={Ning Ding and Yulin Chen and Bokai Xu and Yujia Qin and Zhi Zheng and Shengding Hu and Zhiyuan Liu and Maosong Sun and Bowen Zhou},
year={2023},
eprint={2305.14233},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
You may also wish to cite the Zephyr 7B technical report:
```
@misc{tunstall2023zephyr,
title={Zephyr: Direct Distillation of LM Alignment},
author={Lewis Tunstall and Edward Beeching and Nathan Lambert and Nazneen Rajani and Kashif Rasul and Younes Belkada and Shengyi Huang and Leandro von Werra and Clémentine Fourrier and Nathan Habib and Nathan Sarrazin and Omar Sanseviero and Alexander M. Rush and Thomas Wolf},
year={2023},
eprint={2310.16944},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
``` |
patimus-prime/smiles_L1_target | ---
license: mit
---
|
wiserifle/data-grabber-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 560624
num_examples: 1000
download_size: 113546
dataset_size: 560624
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
BangumiBase/areyoutheonlyonewholovesme | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Are You The Only One Who Loves Me?
This is the image base of the bangumi *Are You the Only One Who Loves Me?*. We detected 77 characters and 8518 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may still contain noise.** If you intend to train models on this dataset, we recommend preprocessing the downloaded data to eliminate potentially noisy samples (roughly 1% of the images).
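As a convenience, here is a minimal sketch of one way to fetch and unpack a single character's archive with `huggingface_hub`; the workflow and the output directory name are illustrative assumptions, not part of the original card.

```python
# Minimal sketch: download character #0's archive from this dataset repo and
# extract it to a local folder. Output path "character_0" is a placeholder.
from zipfile import ZipFile

from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="BangumiBase/areyoutheonlyonewholovesme",
    filename="0/dataset.zip",  # per-character zips listed in the table below
    repo_type="dataset",
)
with ZipFile(path) as zf:
    zf.extractall("character_0")
```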
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 2569 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 42 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 75 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 16 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 29 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 102 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 594 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 94 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 30 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 18 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 35 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 29 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 161 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 27 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 19 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 33 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 571 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 33 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 141 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 130 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 17 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 18 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 9 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 19 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 13 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 38 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 22 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 25 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 14 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 545 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 9 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 15 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 90 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 9 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 199 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 36 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 11 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 21 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 7 | [Download](38/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 39 | 6 | [Download](39/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 40 | 40 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 17 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 267 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 40 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 13 | [Download](44/dataset.zip) |  |  |  |  |  |  |  |  |
| 45 | 9 | [Download](45/dataset.zip) |  |  |  |  |  |  |  |  |
| 46 | 12 | [Download](46/dataset.zip) |  |  |  |  |  |  |  |  |
| 47 | 594 | [Download](47/dataset.zip) |  |  |  |  |  |  |  |  |
| 48 | 75 | [Download](48/dataset.zip) |  |  |  |  |  |  |  |  |
| 49 | 28 | [Download](49/dataset.zip) |  |  |  |  |  |  |  |  |
| 50 | 12 | [Download](50/dataset.zip) |  |  |  |  |  |  |  |  |
| 51 | 10 | [Download](51/dataset.zip) |  |  |  |  |  |  |  |  |
| 52 | 18 | [Download](52/dataset.zip) |  |  |  |  |  |  |  |  |
| 53 | 7 | [Download](53/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 54 | 19 | [Download](54/dataset.zip) |  |  |  |  |  |  |  |  |
| 55 | 9 | [Download](55/dataset.zip) |  |  |  |  |  |  |  |  |
| 56 | 6 | [Download](56/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 57 | 11 | [Download](57/dataset.zip) |  |  |  |  |  |  |  |  |
| 58 | 5 | [Download](58/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| 59 | 184 | [Download](59/dataset.zip) |  |  |  |  |  |  |  |  |
| 60 | 794 | [Download](60/dataset.zip) |  |  |  |  |  |  |  |  |
| 61 | 58 | [Download](61/dataset.zip) |  |  |  |  |  |  |  |  |
| 62 | 44 | [Download](62/dataset.zip) |  |  |  |  |  |  |  |  |
| 63 | 8 | [Download](63/dataset.zip) |  |  |  |  |  |  |  |  |
| 64 | 11 | [Download](64/dataset.zip) |  |  |  |  |  |  |  |  |
| 65 | 12 | [Download](65/dataset.zip) |  |  |  |  |  |  |  |  |
| 66 | 12 | [Download](66/dataset.zip) |  |  |  |  |  |  |  |  |
| 67 | 20 | [Download](67/dataset.zip) |  |  |  |  |  |  |  |  |
| 68 | 148 | [Download](68/dataset.zip) |  |  |  |  |  |  |  |  |
| 69 | 12 | [Download](69/dataset.zip) |  |  |  |  |  |  |  |  |
| 70 | 6 | [Download](70/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 71 | 6 | [Download](71/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 72 | 11 | [Download](72/dataset.zip) |  |  |  |  |  |  |  |  |
| 73 | 6 | [Download](73/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 74 | 31 | [Download](74/dataset.zip) |  |  |  |  |  |  |  |  |
| 75 | 23 | [Download](75/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 69 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
open-llm-leaderboard/details_SC56__Mistral-7B-sumz-dpo-4h | ---
pretty_name: Evaluation run of SC56/Mistral-7B-sumz-dpo-4h
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [SC56/Mistral-7B-sumz-dpo-4h](https://huggingface.co/SC56/Mistral-7B-sumz-dpo-4h)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SC56__Mistral-7B-sumz-dpo-4h\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-28T02:25:30.764321](https://huggingface.co/datasets/open-llm-leaderboard/details_SC56__Mistral-7B-sumz-dpo-4h/blob/main/results_2024-01-28T02-25-30.764321.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6540042781534384,\n\
\ \"acc_stderr\": 0.032119400147504445,\n \"acc_norm\": 0.6534147738972117,\n\
\ \"acc_norm_stderr\": 0.03279056329960576,\n \"mc1\": 0.5679314565483476,\n\
\ \"mc1_stderr\": 0.01734120239498833,\n \"mc2\": 0.7173857907241913,\n\
\ \"mc2_stderr\": 0.014780138265240631\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7081911262798635,\n \"acc_stderr\": 0.013284525292403511,\n\
\ \"acc_norm\": 0.7295221843003413,\n \"acc_norm_stderr\": 0.012980954547659556\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7171878111929895,\n\
\ \"acc_stderr\": 0.0044944549118446225,\n \"acc_norm\": 0.888070105556662,\n\
\ \"acc_norm_stderr\": 0.0031463583832603585\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.028049186315695255,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.028049186315695255\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4365079365079365,\n \"acc_stderr\": 0.0255428468174005,\n \"acc_norm\"\
: 0.4365079365079365,\n \"acc_norm_stderr\": 0.0255428468174005\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033484,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033484\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \
\ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135356,\n\
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135356\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078966,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078966\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290902,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290902\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903338,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903338\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41899441340782123,\n\
\ \"acc_stderr\": 0.016501579306861677,\n \"acc_norm\": 0.41899441340782123,\n\
\ \"acc_norm_stderr\": 0.016501579306861677\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.026256053835718964,\n\
\ \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.026256053835718964\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47196870925684486,\n\
\ \"acc_stderr\": 0.01275015180292244,\n \"acc_norm\": 0.47196870925684486,\n\
\ \"acc_norm_stderr\": 0.01275015180292244\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5679314565483476,\n\
\ \"mc1_stderr\": 0.01734120239498833,\n \"mc2\": 0.7173857907241913,\n\
\ \"mc2_stderr\": 0.014780138265240631\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8397790055248618,\n \"acc_stderr\": 0.01030920949818748\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6952236542835482,\n \
\ \"acc_stderr\": 0.012679297549515432\n }\n}\n```"
repo_url: https://huggingface.co/SC56/Mistral-7B-sumz-dpo-4h
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|arc:challenge|25_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|gsm8k|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hellaswag|10_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T02-25-30.764321.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T02-25-30.764321.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- '**/details_harness|winogrande|5_2024-01-28T02-25-30.764321.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-28T02-25-30.764321.parquet'
- config_name: results
data_files:
- split: 2024_01_28T02_25_30.764321
path:
- results_2024-01-28T02-25-30.764321.parquet
- split: latest
path:
- results_2024-01-28T02-25-30.764321.parquet
---
# Dataset Card for Evaluation run of SC56/Mistral-7B-sumz-dpo-4h
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SC56/Mistral-7B-sumz-dpo-4h](https://huggingface.co/SC56/Mistral-7B-sumz-dpo-4h) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SC56__Mistral-7B-sumz-dpo-4h",
"harness_winogrande_5",
	split="latest")
```
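The aggregated metrics can be loaded the same way; here is a minimal sketch, assuming the "results" configuration and "latest" split defined in the YAML header above:
```python
from datasets import load_dataset

# "results" stores the aggregated metrics of the run; the "latest"
# split always points at the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_SC56__Mistral-7B-sumz-dpo-4h",
                       "results",
                       split="latest")
```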
## Latest results
These are the [latest results from run 2024-01-28T02:25:30.764321](https://huggingface.co/datasets/open-llm-leaderboard/details_SC56__Mistral-7B-sumz-dpo-4h/blob/main/results_2024-01-28T02-25-30.764321.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6540042781534384,
"acc_stderr": 0.032119400147504445,
"acc_norm": 0.6534147738972117,
"acc_norm_stderr": 0.03279056329960576,
"mc1": 0.5679314565483476,
"mc1_stderr": 0.01734120239498833,
"mc2": 0.7173857907241913,
"mc2_stderr": 0.014780138265240631
},
"harness|arc:challenge|25": {
"acc": 0.7081911262798635,
"acc_stderr": 0.013284525292403511,
"acc_norm": 0.7295221843003413,
"acc_norm_stderr": 0.012980954547659556
},
"harness|hellaswag|10": {
"acc": 0.7171878111929895,
"acc_stderr": 0.0044944549118446225,
"acc_norm": 0.888070105556662,
"acc_norm_stderr": 0.0031463583832603585
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.028049186315695255,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.028049186315695255
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.0255428468174005,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.0255428468174005
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568525,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033484,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033484
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135356,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135356
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078966,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078966
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290902,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290902
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.046840993210771065,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.046840993210771065
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903338,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903338
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41899441340782123,
"acc_stderr": 0.016501579306861677,
"acc_norm": 0.41899441340782123,
"acc_norm_stderr": 0.016501579306861677
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.026256053835718964,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.026256053835718964
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47196870925684486,
"acc_stderr": 0.01275015180292244,
"acc_norm": 0.47196870925684486,
"acc_norm_stderr": 0.01275015180292244
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5679314565483476,
"mc1_stderr": 0.01734120239498833,
"mc2": 0.7173857907241913,
"mc2_stderr": 0.014780138265240631
},
"harness|winogrande|5": {
"acc": 0.8397790055248618,
"acc_stderr": 0.01030920949818748
},
"harness|gsm8k|5": {
"acc": 0.6952236542835482,
"acc_stderr": 0.012679297549515432
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Subashs/trap | ---
license: apache-2.0
---
|
AdapterOcean/med_alpaca_standardized_cluster_37_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 17005781
num_examples: 32304
download_size: 8958747
dataset_size: 17005781
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_37_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-one-sec-cv12-each-chunk-uniq/chunk_223 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1248220436.0
num_examples: 243223
download_size: 1279560957
dataset_size: 1248220436.0
---
# Dataset Card for "chunk_223"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SocialGrep/one-million-reddit-jokes | ---
annotations_creators:
- lexyr
language_creators:
- crowdsourced
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 1M<n<10M
source_datasets:
- original
paperswithcode_id: null
---
# Dataset Card for one-million-reddit-jokes
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://socialgrep.com/datasets](https://socialgrep.com/datasets?utm_source=huggingface&utm_medium=link&utm_campaign=onemillionjokes)
- **Point of Contact:** [Website](https://socialgrep.com/contact?utm_source=huggingface&utm_medium=link&utm_campaign=onemillionjokes)
### Dataset Summary
This corpus contains a million posts from /r/jokes.
Posts are annotated with their score.
### Languages
Mainly English.
## Dataset Structure
### Data Instances
A data point is a Reddit post.
### Data Fields
- 'type': the type of the data point. Can be 'post' or 'comment'.
- 'id': the base-36 Reddit ID of the data point. Unique when combined with type.
- 'subreddit.id': the base-36 Reddit ID of the data point's host subreddit. Unique.
- 'subreddit.name': the human-readable name of the data point's host subreddit.
- 'subreddit.nsfw': a boolean marking the data point's host subreddit as NSFW or not.
- 'created_utc': a UTC timestamp for the data point.
- 'permalink': a reference link to the data point on Reddit.
- 'score': score of the data point on Reddit.
- 'domain': the domain of the data point's link.
- 'url': the destination of the data point's link, if any.
- 'selftext': the self-text of the data point, if any.
- 'title': the title of the post data point.
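As an illustration, a minimal sketch of filtering the corpus by the 'score' annotation (assuming the default 'train' split; the field names come from the list above):
```python
from datasets import load_dataset

# Load the corpus and keep only jokes that Reddit scored highly.
jokes = load_dataset("SocialGrep/one-million-reddit-jokes", split="train")
popular = jokes.filter(lambda row: row["score"] is not None and row["score"] >= 100)
print(popular[0]["title"])
```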
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
CC-BY v4.0
### Contributions
[Needs More Information] |
Francesco/bccd-ouzjz | ---
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: width
dtype: int32
- name: height
dtype: int32
- name: objects
sequence:
- name: id
dtype: int64
- name: area
dtype: int64
- name: bbox
sequence: float32
length: 4
- name: category
dtype:
class_label:
names:
'0': bccd
'1': Platelets
'2': RBC
'3': WBC
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- cc
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- object-detection
task_ids: []
pretty_name: bccd-ouzjz
tags:
- rf100
---
# Dataset Card for bccd-ouzjz
**The original COCO dataset is stored at `dataset.tar.gz`.**
## Dataset Description
- **Homepage:** https://universe.roboflow.com/object-detection/bccd-ouzjz
- **Point of Contact:** francesco.zuppichini@gmail.com
### Dataset Summary
bccd-ouzjz
### Supported Tasks and Leaderboards
- `object-detection`: The dataset can be used to train a model for Object Detection.
### Languages
English
## Dataset Structure
### Data Instances
A data point comprises an image and its object annotations.
```
{
'image_id': 15,
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=640x640 at 0x2373B065C18>,
'width': 964043,
'height': 640,
'objects': {
'id': [114, 115, 116, 117],
'area': [3796, 1596, 152768, 81002],
'bbox': [
[302.0, 109.0, 73.0, 52.0],
[810.0, 100.0, 57.0, 28.0],
[160.0, 31.0, 248.0, 616.0],
[741.0, 68.0, 202.0, 401.0]
],
'category': [4, 4, 0, 0]
}
}
```
### Data Fields
- `image_id`: the image id
- `image`: `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`
- `width`: the image width
- `height`: the image height
- `objects`: a dictionary containing bounding box metadata for the objects present on the image
- `id`: the annotation id
- `area`: the area of the bounding box
- `bbox`: the object's bounding box (in the [coco](https://albumentations.ai/docs/getting_started/bounding_boxes_augmentation/#coco) format)
- `category`: the object's category.
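Putting the decoding note above into practice, a minimal sketch (assuming a `train` split is exposed):
```python
from datasets import load_dataset

ds = load_dataset("Francesco/bccd-ouzjz", split="train")

# Query the sample index first, then the "image" column,
# so only this one image file is decoded.
sample = ds[0]
print(sample["image"].size)
print(sample["objects"]["category"])
```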
#### Who are the annotators?
Annotators are Roboflow users
## Additional Information
### Licensing Information
See original homepage https://universe.roboflow.com/object-detection/bccd-ouzjz
### Citation Information
```
@misc{ bccd-ouzjz,
title = { bccd ouzjz Dataset },
type = { Open Source Dataset },
author = { Roboflow 100 },
howpublished = { \url{ https://universe.roboflow.com/object-detection/bccd-ouzjz } },
url = { https://universe.roboflow.com/object-detection/bccd-ouzjz },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2022 },
month = { nov },
note = { visited on 2023-03-29 },
}"
```
### Contributions
Thanks to [@mariosasko](https://github.com/mariosasko) for adding this dataset. |
NPrashanthReddy/book-embeddings | ---
license: mit
---
|
thobauma/harmless-poisoned-0.05-SUDO-murder | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 58402939.44335993
num_examples: 42537
download_size: 31364075
dataset_size: 58402939.44335993
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Multimodal-Fatima/OxfordPets_test_facebook_opt_1.3b_Visclues_ns_3669_random | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_5_bs_16
num_bytes: 128177014.375
num_examples: 3669
- name: fewshot_1_bs_16
num_bytes: 122802439.375
num_examples: 3669
- name: fewshot_3_bs_16
num_bytes: 125493335.375
num_examples: 3669
download_size: 364477060
dataset_size: 376472789.125
---
# Dataset Card for "OxfordPets_test_facebook_opt_1.3b_Visclues_ns_3669_random"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MagicHub/railway-knowledge-QA | ---
license: cc-by-4.0
---
|
martino-canavate/50-pythonclean-dataset | ---
dataset_info:
features:
- name: repo_name
dtype: string
- name: path
dtype: string
- name: copies
dtype: string
- name: size
dtype: string
- name: content
dtype: string
- name: license
dtype: string
- name: hash
dtype: int64
- name: line_mean
dtype: float64
- name: line_max
dtype: int64
- name: alpha_frac
dtype: float64
- name: autogenerated
dtype: bool
splits:
- name: train
num_bytes: 54587956838
num_examples: 5361373
download_size: 19727904629
dataset_size: 54587956838
---
# Dataset Card for "50-pythonclean-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-from-one-sec-cv12/chunk_71 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1190239100
num_examples: 231925
download_size: 1214687215
dataset_size: 1190239100
---
# Dataset Card for "chunk_71"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chenghao/NEWS-COPY-eval | ---
dataset_info:
features:
- name: image_file_name
dtype: string
- name: image_path
dtype: string
- name: object_id
sequence: int64
- name: headline
dtype: string
- name: article
dtype: string
- name: byline
dtype: string
- name: bbox_list
sequence:
sequence: float64
- name: bbox
sequence: float64
- name: full_article_id
dtype: int64
- name: id
dtype: string
- name: imageid
dtype: int64
- name: query
dtype: string
- name: idx
dtype: int64
- name: cluster
dtype: int64
- name: duplicates
sequence: int64
splits:
- name: test
num_bytes: 23946859
num_examples: 14211
- name: val
num_bytes: 8647243
num_examples: 4988
download_size: 19407100
dataset_size: 32594102
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: val
path: data/val-*
license: unknown
---
# NEWS COPY
This dataset contains the validation and test sets for the NEWS-COPY dataset. The original source can be found on [GitHub](https://github.com/dell-research-harvard/NEWS-COPY). The license is unclear.
It contains the following data:
- Historical Newspapers
Training datasets can be found at [chenghao/NEWS-COPY-train](https://huggingface.co/datasets/chenghao/NEWS-COPY-train/).
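A minimal loading sketch (the `test` and `val` split names come from the YAML header above):
```python
from datasets import load_dataset

# Each record carries a "cluster" id and the ids of its "duplicates",
# which the de-duplication evaluation is scored against.
news_copy = load_dataset("chenghao/NEWS-COPY-eval")
print(news_copy["test"][0]["headline"])
print(news_copy["val"].num_rows)
```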
## Citation
```
@inproceedings{silcock-etal-2020-noise,
title = "Noise-Robust De-Duplication at Scale",
author = "Silcock, Emily and D'Amico-Wong, Luca and Yang, Jinglin and Dell, Melissa",
booktitle = "International Conference on Learning Representations (ICLR)",
year = "2023",
}
``` |
gabrielmbmb/wikipedia_es_genstruct_v2 | ---
dataset_info:
features:
- name: title
dtype: string
- name: content
dtype: string
- name: messages
sequence: 'null'
- name: generation_model
sequence: string
- name: generation_prompt
sequence: string
- name: raw_generation_responses
sequence: string
- name: conversation
sequence:
sequence: string
splits:
- name: train
num_bytes: 1508645
num_examples: 500
download_size: 812759
dataset_size: 1508645
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/ines_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ines/イネス/伊内丝 (Arknights)
This is the dataset of ines/イネス/伊内丝 (Arknights), containing 149 images and their tags.
The core tags of this character are `black_hair, horns, long_hair, yellow_eyes, breasts, demon_horns, very_long_hair, parted_bangs, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 149 | 301.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ines_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 149 | 242.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ines_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 371 | 463.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ines_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ines_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, bare_shoulders, looking_at_viewer, solo, long_sleeves, belt, detached_sleeves, pouch, black_shirt, simple_background, black_skirt, closed_mouth, black_dress, cowboy_shot, sleeveless, white_background, holding, weapon, hand_up, string |
| 1 | 5 |  |  |  |  |  | 1girl, belt, black_skirt, closed_mouth, long_sleeves, pouch, solo, bare_shoulders, black_footwear, black_shirt, knee_boots, looking_at_viewer, thighs, black_dress, full_body, medium_breasts, miniskirt, single_knee_pad, standing, thigh_strap, black_nails, holding_dagger, knee_pads, mole, pencil_skirt, shoulder_cutout, sword |
| 2 | 6 |  |  |  |  |  | 1girl, blush, mosaic_censoring, nipples, looking_at_viewer, open_mouth, sweat, cum_in_pussy, hetero, solo_focus, 1boy, after_sex, bare_shoulders, black_footwear, black_nails, boots, cumdrip, nail_polish, symbol-shaped_pupils, vaginal |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | looking_at_viewer | solo | long_sleeves | belt | detached_sleeves | pouch | black_shirt | simple_background | black_skirt | closed_mouth | black_dress | cowboy_shot | sleeveless | white_background | holding | weapon | hand_up | string | black_footwear | knee_boots | thighs | full_body | medium_breasts | miniskirt | single_knee_pad | standing | thigh_strap | black_nails | holding_dagger | knee_pads | mole | pencil_skirt | shoulder_cutout | sword | blush | mosaic_censoring | nipples | open_mouth | sweat | cum_in_pussy | hetero | solo_focus | 1boy | after_sex | boots | cumdrip | nail_polish | symbol-shaped_pupils | vaginal |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------------------|:-------|:---------------|:-------|:-------------------|:--------|:--------------|:--------------------|:--------------|:---------------|:--------------|:--------------|:-------------|:-------------------|:----------|:---------|:----------|:---------|:-----------------|:-------------|:---------|:------------|:-----------------|:------------|:------------------|:-----------|:--------------|:--------------|:-----------------|:------------|:-------|:---------------|:------------------|:--------|:--------|:-------------------|:----------|:-------------|:--------|:---------------|:---------|:-------------|:-------|:------------|:--------|:----------|:--------------|:-----------------------|:----------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | | X | X | | X | X | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
autoevaluate/autoeval-staging-eval-project-6e6ed30f-40d7-4939-99af-0ba4041a05ee-6559 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- glue
eval_info:
task: binary_classification
model: autoevaluate/binary-classification
metrics: ['matthews_correlation']
dataset_name: glue
dataset_config: sst2
dataset_split: validation
col_mapping:
text: sentence
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Binary Text Classification
* Model: autoevaluate/binary-classification
* Dataset: glue
* Config: sst2
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
hemangjoshi37a/autotrain-data-ratnakar_1000_sample_curated | ---
language:
- en
---
# AutoTrain Dataset for project: ratnakar_1000_sample_curated
## Dataset Description
This dataset has been automatically processed by AutoTrain for project ratnakar_1000_sample_curated.
### Languages
The BCP-47 code for the dataset's language is en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"tokens": [
"INTRADAY",
"NAHARINDUS",
" ABOVE ",
"128",
" - 129 SL ",
"126",
" TARGET ",
"140",
" "
],
"tags": [
8,
10,
0,
3,
0,
9,
0,
5,
0
]
},
{
"tokens": [
"INTRADAY",
"ASTRON",
" ABV ",
"39",
" SL ",
"37.50",
" TARGET ",
"45",
" "
],
"tags": [
8,
10,
0,
3,
0,
9,
0,
5,
0
]
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"tokens": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)",
"tags": "Sequence(feature=ClassLabel(num_classes=12, names=['NANA', 'btst', 'delivery', 'enter', 'entry_momentum', 'exit', 'exit2', 'exit3', 'intraday', 'sl', 'symbol', 'touched'], id=None), length=-1, id=None)"
}
```
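Since `tags` is a sequence of `ClassLabel` values, the integer ids can be mapped back to label names; a minimal sketch (assuming the `train` split listed below):
```python
from datasets import load_dataset

ds = load_dataset("hemangjoshi37a/autotrain-data-ratnakar_1000_sample_curated", split="train")

# ClassLabel features carry their own id-to-name mapping.
tag_names = ds.features["tags"].feature.names
example = ds[0]
print(list(zip(example["tokens"], (tag_names[t] for t in example["tags"]))))
```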
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follow:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 726 |
| valid | 259 |
# GitHub Link to this project: [Telegram Trade Msg Backtest ML](https://github.com/hemangjoshi37a/TelegramTradeMsgBacktestML)
# Need a custom model for your application? Place an order on hjLabs.in: [Custom Token Classification or Named Entity Recognition (NER) model as in Natural Language Processing (NLP) Machine Learning](https://hjlabs.in/product/custom-token-classification-or-named-entity-recognition-ner-model-as-in-natural-language-processing-nlp-machine-learning/)
## What this repository contains:
1. Label data using the LabelStudio NER (Named Entity Recognition, or Token Classification) tool.
 convert to 
2. Convert the LabelStudio CSV or JSON export to a HuggingFace AutoTrain dataset using the conversion script.

3. Train the NER model on HuggingFace AutoTrain.

4. Use the HuggingFace AutoTrain model to predict labels on new data in LabelStudio using the LabelStudio ML Backend.



5. Define a Python function to predict labels using the HuggingFace AutoTrain model (see the sketch after this list).


6. Only label new data from the newly predicted-labels dataset that has falsified labels.

7. Backtest the truly labelled dataset against real historical data of the stock using Zerodha kiteconnect and jugaad_trader.

8. Evaluate and plot the total percentage gained since inception, both summed and compounded.

9. Listen to the Telegram channel for new LIVE messages using the Telegram API for algotrading.

10. Serve the app as a Flask web API that responds to web requests with labelled tokens.

11. Compare the outperforming or underperforming results of the Telegram channel tips against the exchange index, by percentage.

Place a custom order on hjLabs.in: [https://hjLabs.in](https://hjlabs.in/?product=custom-algotrading-software-for-zerodha-and-angel-w-source-code)
----------------------------------------------------------------------
### Contact us
Mobile : [+917016525813](tel:+917016525813)
Whatsapp & Telegram : [+919409077371](tel:+919409077371)
Email : [hemangjoshi37a@gmail.com](mailto:hemangjoshi37a@gmail.com)
Place a custom order on hjLabs.in: [https://hjLabs.in](https://hjlabs.in/)
Please contribute your suggestions and corrections to support our efforts.
Thank you.
Buy us a coffee for $5 on PayPal?
[](https://www.paypal.com/cgi-bin/webscr?cmd=_s-xclick&hosted_button_id=5JXC8VRCSUZWJ)
----------------------------------------------------------------------
### Checkout Our Other Repositories
- [pyPortMan](https://github.com/hemangjoshi37a/pyPortMan)
- [transformers_stock_prediction](https://github.com/hemangjoshi37a/transformers_stock_prediction)
- [TrendMaster](https://github.com/hemangjoshi37a/TrendMaster)
- [hjAlgos_notebooks](https://github.com/hemangjoshi37a/hjAlgos_notebooks)
- [AutoCut](https://github.com/hemangjoshi37a/AutoCut)
- [My_Projects](https://github.com/hemangjoshi37a/My_Projects)
- [Cool Arduino and ESP8266 or NodeMCU Projects](https://github.com/hemangjoshi37a/my_Arduino)
- [Telegram Trade Msg Backtest ML](https://github.com/hemangjoshi37a/TelegramTradeMsgBacktestML)
### Checkout Our Other Products
- [WiFi IoT LED Matrix Display](https://hjlabs.in/product/wifi-iot-led-display)
- [SWiBoard WiFi Switch Board IoT Device](https://hjlabs.in/product/swiboard-wifi-switch-board-iot-device)
- [Electric Bicycle](https://hjlabs.in/product/electric-bicycle)
- [Product 3D Design Service with Solidworks](https://hjlabs.in/product/product-3d-design-with-solidworks/)
- [AutoCut : Automatic Wire Cutter Machine](https://hjlabs.in/product/automatic-wire-cutter-machine/)
- [Custom AlgoTrading Software Coding Services](https://hjlabs.in/product/custom-algotrading-software-for-zerodha-and-angel-w-source-code//)
- [SWiBoard :Tasmota MQTT Control App](https://play.google.com/store/apps/details?id=in.hjlabs.swiboard)
- [Custom Token Classification or Named Entity Recognition (NER) model as in Natural Language Processing (NLP) Machine Learning](https://hjlabs.in/product/custom-token-classification-or-named-entity-recognition-ner-model-as-in-natural-language-processing-nlp-machine-learning/)
## Some Cool Arduino and ESP8266 (or NodeMCU) IoT projects:
- [IoT_LED_over_ESP8266_NodeMCU : Turn LED on and off using web server hosted on a nodemcu or esp8266](https://github.com/hemangjoshi37a/my_Arduino/tree/master/IoT_LED_over_ESP8266_NodeMCU)
- [ESP8266_NodeMCU_BasicOTA : Simple OTA (Over The Air) upload code from Arduino IDE using WiFi to NodeMCU or ESP8266](https://github.com/hemangjoshi37a/my_Arduino/tree/master/ESP8266_NodeMCU_BasicOTA)
- [IoT_CSV_SD : Read analog value of Voltage and Current and write it to SD Card in CSV format for Arduino, ESP8266, NodeMCU etc](https://github.com/hemangjoshi37a/my_Arduino/tree/master/IoT_CSV_SD)
- [Honeywell_I2C_Datalogger : Log data in A SD Card from a Honeywell I2C HIH8000 or HIH6000 series sensor having external I2C RTC clock](https://github.com/hemangjoshi37a/my_Arduino/tree/master/Honeywell_I2C_Datalogger)
- [IoT_Load_Cell_using_ESP8266_NodeMC : Read ADC value from High Precision 12bit ADS1015 ADC Sensor and Display on SSD1306 SPI Display as progress bar for Arduino or ESP8266 or NodeMCU](https://github.com/hemangjoshi37a/my_Arduino/tree/master/IoT_Load_Cell_using_ESP8266_NodeMC)
- [IoT_SSD1306_ESP8266_NodeMCU : Read from High Precision 12bit ADC seonsor ADS1015 and display to SSD1306 SPI as progress bar in ESP8266 or NodeMCU or Arduino](https://github.com/hemangjoshi37a/my_Arduino/tree/master/IoT_SSD1306_ESP8266_NodeMCU)
## Checkout Our Awesome 3D GrabCAD Models:
- [AutoCut : Automatic Wire Cutter Machine](https://grabcad.com/library/automatic-wire-cutter-machine-1)
- [ESP Matrix Display 5mm Acrylic Box](https://grabcad.com/library/esp-matrix-display-5mm-acrylic-box-1)
- [Arcylic Bending Machine w/ Hot Air Gun](https://grabcad.com/library/arcylic-bending-machine-w-hot-air-gun-1)
- [Automatic Wire Cutter/Stripper](https://grabcad.com/library/automatic-wire-cutter-stripper-1)
## Our HuggingFace Models :
- [hemangjoshi37a/autotrain-ratnakar_1000_sample_curated-1474454086: Stock tip message NER (Named Entity Recognition or Token Classification) using HuggingFace AutoTrain, LabelStudio and Ratnakar Securities Pvt. Ltd.](https://huggingface.co/hemangjoshi37a/autotrain-ratnakar_1000_sample_curated-1474454086)
## Our HuggingFace Datasets :
- [hemangjoshi37a/autotrain-data-ratnakar_1000_sample_curated: Stock tip message NER (Named Entity Recognition or Token Classification) using HuggingFace AutoTrain, LabelStudio and Ratnakar Securities Pvt. Ltd.](https://huggingface.co/datasets/hemangjoshi37a/autotrain-data-ratnakar_1000_sample_curated)
## We sell Gigs on Fiverr:
- [code android and ios app for you using flutter firebase software stack](https://business.fiverr.com/share/3v14pr)
- [code custom algotrading software for zerodha or angel broking](https://business.fiverr.com/share/kzkvEy)
## Awesome Fiverr Gigs:
- [develop machine learning ner model as in nlp using python](https://www.fiverr.com/share/9YNabx)
- [train custom chatgpt question answering model](https://www.fiverr.com/share/rwx6r7)
- [build algotrading, backtesting and stock monitoring tools using python](https://www.fiverr.com/share/A7Y14q)
- [tutor you in your science problems](https://www.fiverr.com/share/zPzmlz)
- [make cross-platform apps for you](https://www.fiverr.com/share/BGw12l)
|
RIW/small_coco_test_1 | ---
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
- name: url
dtype: string
- name: key
dtype: string
- name: status
dtype: string
- name: error_message
dtype: 'null'
- name: width
dtype: int64
- name: height
dtype: int64
- name: original_width
dtype: int64
- name: original_height
dtype: int64
- name: exif
dtype: string
- name: sha256
dtype: string
- name: watermark
dtype: bool
splits:
- name: train
num_bytes: 816214224.2
num_examples: 9950
- name: validation
num_bytes: 885003521.915
num_examples: 8965
download_size: 362870789
dataset_size: 1701217746.115
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
deokhk/fi_wiki_sentences_1000000 | ---
dataset_info:
features:
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 108630760
num_examples: 1000000
- name: dev
num_bytes: 106924
num_examples: 1000
download_size: 70107634
dataset_size: 108737684
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
---
|
AleBAKA/Tomos1 | ---
license: creativeml-openrail-m
---
|
tulip4attoo/qa_pairs_2nd | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 19245176
num_examples: 51796
download_size: 12215984
dataset_size: 19245176
---
# Dataset Card for "qa_pairs_2nd"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nma/resume_dataset_train | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 2856338396
num_examples: 428365
download_size: 828086360
dataset_size: 2856338396
---
# Dataset Card for "resume_dataset_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_fangloveskari__ORCA_LLaMA_70B_QLoRA | ---
pretty_name: Evaluation run of fangloveskari/ORCA_LLaMA_70B_QLoRA
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [fangloveskari/ORCA_LLaMA_70B_QLoRA](https://huggingface.co/fangloveskari/ORCA_LLaMA_70B_QLoRA)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fangloveskari__ORCA_LLaMA_70B_QLoRA\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T16:47:31.229796](https://huggingface.co/datasets/open-llm-leaderboard/details_fangloveskari__ORCA_LLaMA_70B_QLoRA/blob/main/results_2023-09-23T16-47-31.229796.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3109270134228188,\n\
\ \"em_stderr\": 0.004740252668251192,\n \"f1\": 0.47044567953020594,\n\
\ \"f1_stderr\": 0.004325159736671571,\n \"acc\": 0.5600850420632693,\n\
\ \"acc_stderr\": 0.011402883443890944\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.3109270134228188,\n \"em_stderr\": 0.004740252668251192,\n\
\ \"f1\": 0.47044567953020594,\n \"f1_stderr\": 0.004325159736671571\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2835481425322214,\n \
\ \"acc_stderr\": 0.012415070917508125\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8366219415943172,\n \"acc_stderr\": 0.010390695970273764\n\
\ }\n}\n```"
repo_url: https://huggingface.co/fangloveskari/ORCA_LLaMA_70B_QLoRA
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|arc:challenge|25_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T16_47_31.229796
path:
- '**/details_harness|drop|3_2023-09-23T16-47-31.229796.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T16-47-31.229796.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T16_47_31.229796
path:
- '**/details_harness|gsm8k|5_2023-09-23T16-47-31.229796.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T16-47-31.229796.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hellaswag|10_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T16_47_31.229796
path:
- '**/details_harness|winogrande|5_2023-09-23T16-47-31.229796.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T16-47-31.229796.parquet'
- config_name: results
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- results_2023-08-29T08:51:06.198415.parquet
- split: 2023_09_23T16_47_31.229796
path:
- results_2023-09-23T16-47-31.229796.parquet
- split: latest
path:
- results_2023-09-23T16-47-31.229796.parquet
---
# Dataset Card for Evaluation run of fangloveskari/ORCA_LLaMA_70B_QLoRA
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/fangloveskari/ORCA_LLaMA_70B_QLoRA
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [fangloveskari/ORCA_LLaMA_70B_QLoRA](https://huggingface.co/fangloveskari/ORCA_LLaMA_70B_QLoRA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fangloveskari__ORCA_LLaMA_70B_QLoRA",
"harness_winogrande_5",
split="train")
```
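Any split listed in the configuration section above can be loaded the same way. As a minimal sketch (config and split names copied verbatim from the configs in this card), you can target the "latest" alias or a specific timestamped run:
```python
from datasets import load_dataset

# "latest" is an alias that resolves to the most recent run for this task.
latest = load_dataset(
    "open-llm-leaderboard/details_fangloveskari__ORCA_LLaMA_70B_QLoRA",
    "harness_gsm8k_5",
    split="latest",
)

# A specific run can be loaded by its timestamped split name.
run = load_dataset(
    "open-llm-leaderboard/details_fangloveskari__ORCA_LLaMA_70B_QLoRA",
    "harness_gsm8k_5",
    split="2023_09_23T16_47_31.229796",
)
```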
## Latest results
These are the [latest results from run 2023-09-23T16:47:31.229796](https://huggingface.co/datasets/open-llm-leaderboard/details_fangloveskari__ORCA_LLaMA_70B_QLoRA/blob/main/results_2023-09-23T16-47-31.229796.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.3109270134228188,
"em_stderr": 0.004740252668251192,
"f1": 0.47044567953020594,
"f1_stderr": 0.004325159736671571,
"acc": 0.5600850420632693,
"acc_stderr": 0.011402883443890944
},
"harness|drop|3": {
"em": 0.3109270134228188,
"em_stderr": 0.004740252668251192,
"f1": 0.47044567953020594,
"f1_stderr": 0.004325159736671571
},
"harness|gsm8k|5": {
"acc": 0.2835481425322214,
"acc_stderr": 0.012415070917508125
},
"harness|winogrande|5": {
"acc": 0.8366219415943172,
"acc_stderr": 0.010390695970273764
}
}
```
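To work with these aggregated numbers programmatically, one option is to fetch the linked JSON file directly. A minimal sketch, assuming the file sits at the repo root (as in the blob URL above) and that its top-level keys match the snippet shown:
```python
import json

from huggingface_hub import hf_hub_download

# Filename taken from the "latest results" link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_fangloveskari__ORCA_LLaMA_70B_QLoRA",
    filename="results_2023-09-23T16-47-31.229796.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

# Assumes the top-level layout matches the snippet above.
print(results["all"]["acc"])  # 0.5600850420632693 for this run
```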
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ImanNalia/coraal_train_v2 | ---
dataset_info:
features:
- name: segment_filename
dtype: string
- name: text
dtype: string
- name: audio
struct:
- name: audio
struct:
- name: array
sequence: float32
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 8632503027
num_examples: 11373
download_size: 8641855728
dataset_size: 8632503027
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
shaaz10/j11 | ---
license: unknown
---
|
BramVanroy/test-dataset-dont-delete | ---
dataset_info:
features:
- name: system
dtype: string
- name: question
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 8856
num_examples: 4
download_size: 25365
dataset_size: 8856
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
This dataset is a tiny subset of [Intel/orca_dpo_pairs](https://huggingface.co/datasets/Intel/orca_dpo_pairs), used for internal testing. |
lorenzoncina/embeddings_FAQ | ---
license: mit
---
|
distilled-one-sec-cv12-each-chunk-uniq/chunk_211 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1299211988.0
num_examples: 253159
download_size: 1332413108
dataset_size: 1299211988.0
---
# Dataset Card for "chunk_211"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Den4ikAI/russian_code_qa | ---
license: mit
---
|
sethapun/arithmetic_2as_1to10 | ---
dataset_info:
features:
- name: expression
dtype: string
- name: answer
dtype: int64
- name: label
dtype:
class_label:
names:
'0': 'false'
'1': 'true'
splits:
- name: train
num_bytes: 54740
num_examples: 2000
- name: validation
num_bytes: 10960
num_examples: 400
download_size: 11744
dataset_size: 65700
---
# Dataset Card for "arithmetic_2as_1to10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Sleoruiz/dataset-tokenized-mdeberta | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 2852368832
num_examples: 406552
- name: test
num_bytes: 713856952
num_examples: 101747
download_size: 319209887
dataset_size: 3566225784
---
# Dataset Card for "dataset-tokenized-mdeberta"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
GEM-submissions/lewtun__this-is-a-test-submission-2__1656667730 | ---
benchmark: gem
type: prediction
submission_name: This is a test submission 2
tags:
- evaluation
- benchmark
---
# GEM Submission
Submission name: This is a test submission 2
|
open-llm-leaderboard/details_teknium__CollectiveCognition-v1.1-Mistral-7B | ---
pretty_name: Evaluation run of teknium/CollectiveCognition-v1.1-Mistral-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [teknium/CollectiveCognition-v1.1-Mistral-7B](https://huggingface.co/teknium/CollectiveCognition-v1.1-Mistral-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 5 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_teknium__CollectiveCognition-v1.1-Mistral-7B\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-03T17:47:55.890655](https://huggingface.co/datasets/open-llm-leaderboard/details_teknium__CollectiveCognition-v1.1-Mistral-7B/blob/main/results_2023-12-03T17-47-55.890655.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.35860500379075055,\n\
\ \"acc_stderr\": 0.01321031736413403\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.35860500379075055,\n \"acc_stderr\": 0.01321031736413403\n\
\ }\n}\n```"
repo_url: https://huggingface.co/teknium/CollectiveCognition-v1.1-Mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|arc:challenge|25_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|arc:challenge|25_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T18_24_08.168024
path:
- '**/details_harness|drop|3_2023-10-24T18-24-08.168024.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T18-24-08.168024.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T18_24_08.168024
path:
- '**/details_harness|gsm8k|5_2023-10-24T18-24-08.168024.parquet'
- split: 2023_12_03T17_43_05.326590
path:
- '**/details_harness|gsm8k|5_2023-12-03T17-43-05.326590.parquet'
- split: 2023_12_03T17_47_55.890655
path:
- '**/details_harness|gsm8k|5_2023-12-03T17-47-55.890655.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-03T17-47-55.890655.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hellaswag|10_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hellaswag|10_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-08T13-48-47.550072.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-12T08-33-23.557832.parquet'
- split: 2023_11_08T13_48_47.550072
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-08T13-48-47.550072.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-08T13-48-47.550072.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T18_24_08.168024
path:
- '**/details_harness|winogrande|5_2023-10-24T18-24-08.168024.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T18-24-08.168024.parquet'
- config_name: results
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- results_2023-10-12T08-33-23.557832.parquet
- split: 2023_10_24T18_24_08.168024
path:
- results_2023-10-24T18-24-08.168024.parquet
- split: 2023_11_08T13_48_47.550072
path:
- results_2023-11-08T13-48-47.550072.parquet
- split: 2023_12_03T17_43_05.326590
path:
- results_2023-12-03T17-43-05.326590.parquet
- split: 2023_12_03T17_47_55.890655
path:
- results_2023-12-03T17-47-55.890655.parquet
- split: latest
path:
- results_2023-12-03T17-47-55.890655.parquet
---
# Dataset Card for Evaluation run of teknium/CollectiveCognition-v1.1-Mistral-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/teknium/CollectiveCognition-v1.1-Mistral-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [teknium/CollectiveCognition-v1.1-Mistral-7B](https://huggingface.co/teknium/CollectiveCognition-v1.1-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 5 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_teknium__CollectiveCognition-v1.1-Mistral-7B",
"harness_gsm8k_5",
split="train")
```
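Similarly, a minimal sketch for loading the aggregated results at their latest state (the `results` configuration and its `latest` split are taken from the metadata above):
```python
from datasets import load_dataset

# load the aggregated results of the most recent run;
# the "results" config and "latest" split are defined in the YAML metadata above
results = load_dataset(
    "open-llm-leaderboard/details_teknium__CollectiveCognition-v1.1-Mistral-7B",
    "results",
    split="latest",
)
print(results[0])
```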
## Latest results
These are the [latest results from run 2023-12-03T17:47:55.890655](https://huggingface.co/datasets/open-llm-leaderboard/details_teknium__CollectiveCognition-v1.1-Mistral-7B/blob/main/results_2023-12-03T17-47-55.890655.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.35860500379075055,
"acc_stderr": 0.01321031736413403
},
"harness|gsm8k|5": {
"acc": 0.35860500379075055,
"acc_stderr": 0.01321031736413403
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
AIRI-NLP/quality_counter_new_3584 | ---
dataset_info:
features:
- name: context
dtype: string
- name: word
dtype: string
- name: claim
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 553896954
num_examples: 20000
- name: validation
num_bytes: 224676272
num_examples: 8000
- name: test
num_bytes: 56237858
num_examples: 2300
download_size: 26536911
dataset_size: 834811084
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
mstz/page_blocks | ---
language:
- en
tags:
- page_blocks
- tabular_classification
- binary_classification
- multiclass_classification
pretty_name: Page Blocks
size_categories:
- 1K<n<10K
task_categories:
- tabular-classification
configs:
- page_blocks
- page_blocks_binary
license: cc
---
# PageBlocks
The [PageBlocks dataset](https://archive-beta.ics.uci.edu/dataset/76/page_blocks) from the [UCI repository](https://archive-beta.ics.uci.edu/).
How many transitions does the page block have?
# Configurations and tasks
| **Configuration** | **Task** |
|-------------------|---------------------------|
| page_blocks | Multiclass classification |
| page_blocks_binary| Binary classification | |
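A minimal loading sketch for the two configurations above (configuration names are taken from the table; everything else follows standard `datasets` usage):
```python
from datasets import load_dataset

# multiclass task; use "page_blocks_binary" for the binary variant
ds = load_dataset("mstz/page_blocks", "page_blocks")
print(ds)
```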
rufimelo/PortugueseLegalSentences-v2 | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- pt
license:
- apache-2.0
multilinguality:
- monolingual
source_datasets:
- original
---
# Portuguese Legal Sentences
Collection of Legal Sentences from the Portuguese Supreme Court of Justice
This dataset is intended for MLM and TSDAE training.
Extended version of rufimelo/PortugueseLegalSentences-v1
200000/200000/100000
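A minimal loading sketch (this assumes standard `datasets` splits; the 200000/200000/100000 figures above are read as split sizes):
```python
from datasets import load_dataset

# load the corpus; split sizes should match the figures above
ds = load_dataset("rufimelo/PortugueseLegalSentences-v2")
print(ds)
```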
### Contributions
[@rufimelo99](https://github.com/rufimelo99)
|
jdapaah/asante-twi-bible | ---
language:
- ak
- tw
task_categories:
- automatic-speech-recognition
- translation
- text-to-speech
tags:
- asr
- africa
- language
- ml
- twi
- akan
dataset_info:
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 45832792.0
num_examples: 60
download_size: 28889005
dataset_size: 45832792.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
<h1>Asante Twi Bible Audio</h1>
This dataset comprises audio recorded from [YouVersion's website](https://www.bible.com/bible/2094/), which hosts audio and written copies of the Bible in multiple languages. It includes audio and matching transcriptions of the Bible, useful for Automatic Speech Recognition (ASR) and Speech Generation applications.
In its first iteration, it contains Romans 1 - 4 in the *Asante Twi Nkwa Asɛm* version. The dataset will grow as more data is preprocessed. |
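A minimal sketch for inspecting one audio/transcription pair (the `audio` and `sentence` feature names come from the metadata above; the decoded audio dict follows the standard `datasets` Audio feature):
```python
from datasets import load_dataset

# load the train split and inspect the first recording
ds = load_dataset("jdapaah/asante-twi-bible", split="train")
sample = ds[0]
print(sample["sentence"])                # transcription text
print(sample["audio"]["sampling_rate"])  # decoded audio metadata
```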
Doub7e/SDv2-GPT4Spatial-200-T5 | ---
dataset_info:
features:
- name: image
dtype: image
- name: prompt
dtype: string
- name: T5_last_hidden_states
sequence:
sequence:
sequence: float32
splits:
- name: train
num_bytes: 203072791.0
num_examples: 200
download_size: 204322556
dataset_size: 203072791.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "SDv2-GPT4Spatial-200-T5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Shekswess/gemma_medical_meadow_wikidoc_instruct_dataset | ---
language:
- en
size_categories:
- 10K<n<100K
task_categories:
- question-answering
dataset_info:
features:
- name: output
dtype: string
- name: input
dtype: string
- name: instruction
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 22030922
num_examples: 9998
download_size: 11323025
dataset_size: 22030922
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
tags:
- medical
---
Dataset made for supervised instruction fine-tuning of Gemma LLMs, based on the Medical Meadow Wikidoc dataset:
- Medical meadow wikidoc (https://huggingface.co/datasets/medalpaca/medical_meadow_wikidoc/blob/main/README.md)
## Medical meadow wikidoc
The Medical Meadow Wikidoc dataset comprises question-answer pairs sourced from WikiDoc, an online platform where medical professionals collaboratively contribute and share contemporary medical knowledge. WikiDoc features two primary sections: the "Living Textbook" and "Patient Information". The "Living Textbook" encompasses chapters across various medical specialties, from which we extracted content. Utilizing GPT-3.5-Turbo, the paragraph headings were transformed into questions, with the respective paragraphs used as answers. Notably, the structure of "Patient Information" is distinct; each section's subheading already serves as a question, eliminating the necessity for rephrasing. |
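A minimal sketch for inspecting one record (the `instruction`, `input`, `output`, and `prompt` fields come from the metadata above):
```python
from datasets import load_dataset

# load the train split and print one instruction/answer pair
ds = load_dataset("Shekswess/gemma_medical_meadow_wikidoc_instruct_dataset", split="train")
row = ds[0]
print(row["instruction"])
print(row["output"])
```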
deepghs/anime_ch_skin_color | ---
license: mit
task_categories:
- image-classification
tags:
- art
size_categories:
- 10K<n<100K
--- |
collabteza/sys-human_db2 | ---
dataset_info:
features:
- name: System Prompt
dtype: string
- name: Human Prompt
dtype: string
- name: Output
dtype: string
splits:
- name: train
num_bytes: 972089
num_examples: 1530
download_size: 460352
dataset_size: 972089
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "sys-human_db2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/ashigara_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ashigara/足柄/足柄 (Azur Lane)
This is the dataset of ashigara/足柄/足柄 (Azur Lane), containing 16 images and their tags.
The core tags of this character are `breasts, long_hair, animal_ears, red_eyes, bangs, headphones, hair_between_eyes, very_long_hair, hair_ornament, animal_ear_fluff, large_breasts, purple_hair, twintails, black_hair, blue_hair, hair_flower, cat_ears`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 16 | 25.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ashigara_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 16 | 13.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ashigara_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 38 | 28.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ashigara_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 16 | 21.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ashigara_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 38 | 40.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ashigara_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
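The preprocessed packages can be fetched the same way as the raw archive shown below; a minimal sketch for the 800px IMG+TXT package (filename taken from the table above):
```python
from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/ashigara_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)
```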
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ashigara_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | looking_at_viewer, short_sleeves, 1girl, blue_shirt, blue_skirt, blush, medium_breasts, miniskirt, pleated_skirt, solo, black_gloves, brown_thighhighs, crop_top, holding_sword, katana, midriff, sheathed, simple_background, thighs, white_background, white_sailor_collar, blue_serafuku, closed_mouth, collarbone, neckerchief, no_shoes, open_mouth, outdoors, sitting, smile |
| 1 | 7 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, solo, flower, blush, navel, skindentation, smile, thigh_strap, thighs, bare_shoulders, choker, collarbone, front-tie_bikini_top, multi-strapped_bikini, open_mouth, side-tie_bikini_bottom, simple_background, white_background, wolf_ears |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | looking_at_viewer | short_sleeves | 1girl | blue_shirt | blue_skirt | blush | medium_breasts | miniskirt | pleated_skirt | solo | black_gloves | brown_thighhighs | crop_top | holding_sword | katana | midriff | sheathed | simple_background | thighs | white_background | white_sailor_collar | blue_serafuku | closed_mouth | collarbone | neckerchief | no_shoes | open_mouth | outdoors | sitting | smile | cleavage | flower | navel | skindentation | thigh_strap | bare_shoulders | choker | front-tie_bikini_top | multi-strapped_bikini | side-tie_bikini_bottom | wolf_ears |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------|:----------------|:--------|:-------------|:-------------|:--------|:-----------------|:------------|:----------------|:-------|:---------------|:-------------------|:-----------|:----------------|:---------|:----------|:-----------|:--------------------|:---------|:-------------------|:----------------------|:----------------|:---------------|:-------------|:--------------|:-----------|:-------------|:-----------|:----------|:--------|:-----------|:---------|:--------|:----------------|:--------------|:-----------------|:---------|:-----------------------|:------------------------|:-------------------------|:------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | | X | | | X | | | | X | | | | | | | | X | X | X | | | | X | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
irds/mr-tydi_ar_train | ---
pretty_name: '`mr-tydi/ar/train`'
viewer: false
source_datasets: ['irds/mr-tydi_ar']
task_categories:
- text-retrieval
---
# Dataset Card for `mr-tydi/ar/train`
The `mr-tydi/ar/train` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/mr-tydi#mr-tydi/ar/train).
# Data
This dataset provides:
- `queries` (i.e., topics); count=12,377
- `qrels`: (relevance assessments); count=12,377
- For `docs`, use [`irds/mr-tydi_ar`](https://huggingface.co/datasets/irds/mr-tydi_ar)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/mr-tydi_ar_train', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/mr-tydi_ar_train', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Zhang2021MrTyDi,
title={{Mr. TyDi}: A Multi-lingual Benchmark for Dense Retrieval},
author={Xinyu Zhang and Xueguang Ma and Peng Shi and Jimmy Lin},
year={2021},
journal={arXiv:2108.08787},
}
@article{Clark2020TyDiQa,
title={{TyDi QA}: A Benchmark for Information-Seeking Question Answering in Typologically Diverse Languages},
author={Jonathan H. Clark and Eunsol Choi and Michael Collins and Dan Garrette and Tom Kwiatkowski and Vitaly Nikolaev and Jennimaria Palomaki},
year={2020},
journal={Transactions of the Association for Computational Linguistics}
}
```
|
arieg/bw_spec_cls_80_24 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '57640'
'1': '57648'
'2': '57658'
'3': '57661'
'4': '57662'
'5': '57663'
'6': '57665'
'7': '57691'
'8': '57697'
'9': '57819'
'10': '57820'
'11': '57821'
'12': '57822'
'13': '57823'
'14': '57936'
'15': '57937'
'16': '57938'
'17': '57939'
'18': '57943'
'19': '57968'
'20': '58052'
'21': '58053'
'22': '58054'
'23': '58060'
'24': '58061'
'25': '58063'
'26': '58068'
'27': '58070'
'28': '58115'
'29': '58116'
'30': '58117'
'31': '58135'
'32': '58140'
'33': '58161'
'34': '58162'
'35': '58164'
'36': '58166'
'37': '58169'
'38': '58170'
'39': '58173'
'40': '58174'
'41': '58212'
'42': '58213'
'43': '58215'
'44': '58221'
'45': '58225'
'46': '58341'
'47': '58474'
'48': '59078'
'49': '59373'
'50': '59374'
'51': '59561'
'52': '59653'
'53': '59654'
'54': '59656'
'55': '59657'
'56': '59658'
'57': '59659'
'58': '59660'
'59': '59663'
'60': '59664'
'61': '59666'
'62': '59667'
'63': '59669'
'64': '59671'
'65': '59673'
'66': '59675'
'67': '59676'
'68': '59677'
'69': '59678'
'70': '59679'
'71': '59680'
'72': '59681'
'73': '59682'
'74': '59683'
'75': '59684'
'76': '59685'
'77': '59686'
'78': '59687'
'79': '59688'
splits:
- name: train
num_bytes: 87569851.2
num_examples: 1600
- name: test
num_bytes: 22682287.0
num_examples: 400
download_size: 113474750
dataset_size: 110252138.2
---
# Dataset Card for "bw_spec_cls_80_24"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sergeipetrov/transformers-diffusers-docs-embed | ---
dataset_info:
features:
- name: vector
sequence: float64
- name: text
dtype: string
splits:
- name: train
num_bytes: 33821040
num_examples: 3824
download_size: 33494533
dataset_size: 33821040
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tyzhu/squad_qa_rare_v5_full_recite_ans_sent | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 7798024
num_examples: 5070
- name: validation
num_bytes: 405531
num_examples: 300
download_size: 0
dataset_size: 8203555
---
# Dataset Card for "squad_qa_rare_v5_full_recite_ans_sent"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_mncai__agiin-13.6B-v0.1 | ---
pretty_name: Evaluation run of mncai/agiin-13.6B-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mncai/agiin-13.6B-v0.1](https://huggingface.co/mncai/agiin-13.6B-v0.1) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mncai__agiin-13.6B-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-16T16:35:40.891850](https://huggingface.co/datasets/open-llm-leaderboard/details_mncai__agiin-13.6B-v0.1/blob/main/results_2023-12-16T16-35-40.891850.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6140808996502091,\n\
\ \"acc_stderr\": 0.03322600041693132,\n \"acc_norm\": 0.6172006340341523,\n\
\ \"acc_norm_stderr\": 0.033898195854611735,\n \"mc1\": 0.5214198286413708,\n\
\ \"mc1_stderr\": 0.01748743214471164,\n \"mc2\": 0.6797310501619931,\n\
\ \"mc2_stderr\": 0.015395432575157594\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6672354948805461,\n \"acc_stderr\": 0.013769863046192302,\n\
\ \"acc_norm\": 0.6945392491467577,\n \"acc_norm_stderr\": 0.013460080478002508\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6861183031268672,\n\
\ \"acc_stderr\": 0.004631205099684944,\n \"acc_norm\": 0.8663612826130253,\n\
\ \"acc_norm_stderr\": 0.0033956833380563364\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5481481481481482,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.5481481481481482,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.039889037033362836,\n\
\ \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.039889037033362836\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.02964781353936525,\n\
\ \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.02964781353936525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n\
\ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n\
\ \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110175,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110175\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.037038511930995215,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.037038511930995215\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7258064516129032,\n\
\ \"acc_stderr\": 0.025378139970885203,\n \"acc_norm\": 0.7258064516129032,\n\
\ \"acc_norm_stderr\": 0.025378139970885203\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338642,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338642\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8134715025906736,\n \"acc_stderr\": 0.028112091210117467,\n\
\ \"acc_norm\": 0.8134715025906736,\n \"acc_norm_stderr\": 0.028112091210117467\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635474,\n\
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635474\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.031357095996135904,\n\
\ \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.031357095996135904\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612896,\n \"\
acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612896\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5555555555555556,\n \"acc_stderr\": 0.03388857118502325,\n \"\
acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03388857118502325\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654366,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654366\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467766,\n\
\ \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467766\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097654,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097654\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650741,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650741\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615624,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615624\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.023636873317489267,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.023636873317489267\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7573435504469987,\n\
\ \"acc_stderr\": 0.015329888940899867,\n \"acc_norm\": 0.7573435504469987,\n\
\ \"acc_norm_stderr\": 0.015329888940899867\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6791907514450867,\n \"acc_stderr\": 0.025131000233647886,\n\
\ \"acc_norm\": 0.6791907514450867,\n \"acc_norm_stderr\": 0.025131000233647886\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.46033519553072627,\n\
\ \"acc_stderr\": 0.016669799592112025,\n \"acc_norm\": 0.46033519553072627,\n\
\ \"acc_norm_stderr\": 0.016669799592112025\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6535947712418301,\n \"acc_stderr\": 0.027245613047215355,\n\
\ \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.027245613047215355\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.026041766202717163,\n\
\ \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.026041766202717163\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370593,\n \
\ \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370593\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47327249022164275,\n\
\ \"acc_stderr\": 0.012751977967676008,\n \"acc_norm\": 0.47327249022164275,\n\
\ \"acc_norm_stderr\": 0.012751977967676008\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6360294117647058,\n \"acc_stderr\": 0.02922719246003203,\n\
\ \"acc_norm\": 0.6360294117647058,\n \"acc_norm_stderr\": 0.02922719246003203\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6486928104575164,\n \"acc_stderr\": 0.019312676065786558,\n \
\ \"acc_norm\": 0.6486928104575164,\n \"acc_norm_stderr\": 0.019312676065786558\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6285714285714286,\n \"acc_stderr\": 0.030932858792789845,\n\
\ \"acc_norm\": 0.6285714285714286,\n \"acc_norm_stderr\": 0.030932858792789845\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036844,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036844\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5214198286413708,\n\
\ \"mc1_stderr\": 0.01748743214471164,\n \"mc2\": 0.6797310501619931,\n\
\ \"mc2_stderr\": 0.015395432575157594\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7868981846882399,\n \"acc_stderr\": 0.011508957690722743\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.46474601971190294,\n \
\ \"acc_stderr\": 0.01373820799017732\n }\n}\n```"
repo_url: https://huggingface.co/mncai/agiin-13.6B-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|arc:challenge|25_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|gsm8k|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hellaswag|10_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T16-35-40.891850.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T16-35-40.891850.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- '**/details_harness|winogrande|5_2023-12-16T16-35-40.891850.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-16T16-35-40.891850.parquet'
- config_name: results
data_files:
- split: 2023_12_16T16_35_40.891850
path:
- results_2023-12-16T16-35-40.891850.parquet
- split: latest
path:
- results_2023-12-16T16-35-40.891850.parquet
---
# Dataset Card for Evaluation run of mncai/agiin-13.6B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mncai/agiin-13.6B-v0.1](https://huggingface.co/mncai/agiin-13.6B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mncai__agiin-13.6B-v0.1",
"harness_winogrande_5",
split="train")
```
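The aggregated metrics of the run live in the `results` configuration listed in the YAML above. As a minimal sketch (using the `latest` split defined in that configuration), they can be loaded the same way:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run;
# the "latest" split points to the most recent results file.
results = load_dataset(
    "open-llm-leaderboard/details_mncai__agiin-13.6B-v0.1",
    "results",
    split="latest",
)
```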
## Latest results
These are the [latest results from run 2023-12-16T16:35:40.891850](https://huggingface.co/datasets/open-llm-leaderboard/details_mncai__agiin-13.6B-v0.1/blob/main/results_2023-12-16T16-35-40.891850.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6140808996502091,
"acc_stderr": 0.03322600041693132,
"acc_norm": 0.6172006340341523,
"acc_norm_stderr": 0.033898195854611735,
"mc1": 0.5214198286413708,
"mc1_stderr": 0.01748743214471164,
"mc2": 0.6797310501619931,
"mc2_stderr": 0.015395432575157594
},
"harness|arc:challenge|25": {
"acc": 0.6672354948805461,
"acc_stderr": 0.013769863046192302,
"acc_norm": 0.6945392491467577,
"acc_norm_stderr": 0.013460080478002508
},
"harness|hellaswag|10": {
"acc": 0.6861183031268672,
"acc_stderr": 0.004631205099684944,
"acc_norm": 0.8663612826130253,
"acc_norm_stderr": 0.0033956833380563364
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5481481481481482,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.5481481481481482,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.039889037033362836,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.039889037033362836
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6339622641509434,
"acc_stderr": 0.02964781353936525,
"acc_norm": 0.6339622641509434,
"acc_norm_stderr": 0.02964781353936525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110175,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110175
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.037038511930995215,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.037038511930995215
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440678,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440678
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7258064516129032,
"acc_stderr": 0.025378139970885203,
"acc_norm": 0.7258064516129032,
"acc_norm_stderr": 0.025378139970885203
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338642,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338642
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8134715025906736,
"acc_stderr": 0.028112091210117467,
"acc_norm": 0.8134715025906736,
"acc_norm_stderr": 0.028112091210117467
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635474,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635474
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.031357095996135904,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.031357095996135904
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612896,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612896
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654366,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654366
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467766,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467766
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097654,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097654
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650741,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650741
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615624,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489267,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489267
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7573435504469987,
"acc_stderr": 0.015329888940899867,
"acc_norm": 0.7573435504469987,
"acc_norm_stderr": 0.015329888940899867
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6791907514450867,
"acc_stderr": 0.025131000233647886,
"acc_norm": 0.6791907514450867,
"acc_norm_stderr": 0.025131000233647886
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.46033519553072627,
"acc_stderr": 0.016669799592112025,
"acc_norm": 0.46033519553072627,
"acc_norm_stderr": 0.016669799592112025
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6535947712418301,
"acc_stderr": 0.027245613047215355,
"acc_norm": 0.6535947712418301,
"acc_norm_stderr": 0.027245613047215355
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.026041766202717163,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.026041766202717163
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370593,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370593
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47327249022164275,
"acc_stderr": 0.012751977967676008,
"acc_norm": 0.47327249022164275,
"acc_norm_stderr": 0.012751977967676008
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6360294117647058,
"acc_stderr": 0.02922719246003203,
"acc_norm": 0.6360294117647058,
"acc_norm_stderr": 0.02922719246003203
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6486928104575164,
"acc_stderr": 0.019312676065786558,
"acc_norm": 0.6486928104575164,
"acc_norm_stderr": 0.019312676065786558
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6285714285714286,
"acc_stderr": 0.030932858792789845,
"acc_norm": 0.6285714285714286,
"acc_norm_stderr": 0.030932858792789845
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036844,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036844
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5214198286413708,
"mc1_stderr": 0.01748743214471164,
"mc2": 0.6797310501619931,
"mc2_stderr": 0.015395432575157594
},
"harness|winogrande|5": {
"acc": 0.7868981846882399,
"acc_stderr": 0.011508957690722743
},
"harness|gsm8k|5": {
"acc": 0.46474601971190294,
"acc_stderr": 0.01373820799017732
}
}
```
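The per-task entries above can be post-processed directly. As a minimal sketch, assuming the results file linked above has been downloaded locally under its original name and contains the dictionary shown, the mean accuracy over the MMLU (`hendrycksTest`) subjects could be computed like this:
```python
import json

# Minimal sketch: assumes the results file linked above was downloaded
# locally under its original name and contains the dictionary shown.
with open("results_2023-12-16T16-35-40.891850.json") as f:
    results = json.load(f)

# Collect the per-subject MMLU ("hendrycksTest") accuracies and average them.
mmlu = {
    task: scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest")
}
print(f"{len(mmlu)} MMLU subjects, mean acc: {sum(mmlu.values()) / len(mmlu):.4f}")
```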
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
rirv938/dummy_data | ---
dataset_info:
features:
- name: dummy
dtype: int64
splits:
- name: train
num_bytes: 8
num_examples: 1
download_size: 845
dataset_size: 8
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
DarqueDante/megamerge | ---
dataset_info:
features:
- name: text_token_length
dtype: int64
- name: prompt
dtype: string
- name: text
dtype: string
- name: seed_data
dtype: string
- name: format
dtype: string
- name: audience
dtype: string
splits:
- name: train
num_bytes: 69075726295
num_examples: 12426348
download_size: 38943888490
dataset_size: 69075726295
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-PT-1B7 | ---
pretty_name: Evaluation run of nicholasKluge/Aira-Instruct-PT-1B7
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nicholasKluge/Aira-Instruct-PT-1B7](https://huggingface.co/nicholasKluge/Aira-Instruct-PT-1B7)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-PT-1B7\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-09T20:59:57.404122](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-PT-1B7/blob/main/results_2023-08-09T20%3A59%3A57.404122.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2495089085448401,\n\
\ \"acc_stderr\": 0.03135286921160441,\n \"acc_norm\": 0.2508452551647926,\n\
\ \"acc_norm_stderr\": 0.03137437179137316,\n \"mc1\": 0.22888616891064872,\n\
\ \"mc1_stderr\": 0.014706994909055028,\n \"mc2\": 0.4595409979303444,\n\
\ \"mc2_stderr\": 0.01663090921738331\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2030716723549488,\n \"acc_stderr\": 0.011755899303705583,\n\
\ \"acc_norm\": 0.2687713310580205,\n \"acc_norm_stderr\": 0.012955065963710672\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25941047600079664,\n\
\ \"acc_stderr\": 0.004374153847826759,\n \"acc_norm\": 0.2725552678749253,\n\
\ \"acc_norm_stderr\": 0.004443639394177424\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.26037735849056604,\n \"acc_stderr\": 0.027008766090708087,\n\
\ \"acc_norm\": 0.26037735849056604,\n \"acc_norm_stderr\": 0.027008766090708087\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n\
\ \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n\
\ \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.19,\n\
\ \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.24680851063829787,\n \"acc_stderr\": 0.028185441301234102,\n\
\ \"acc_norm\": 0.24680851063829787,\n \"acc_norm_stderr\": 0.028185441301234102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.036646663372252565,\n\
\ \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.036646663372252565\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643898,\n \"\
acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643898\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.16666666666666666,\n\
\ \"acc_stderr\": 0.03333333333333337,\n \"acc_norm\": 0.16666666666666666,\n\
\ \"acc_norm_stderr\": 0.03333333333333337\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.2870967741935484,\n \"acc_stderr\": 0.025736542745594525,\n \"\
acc_norm\": 0.2870967741935484,\n \"acc_norm_stderr\": 0.025736542745594525\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3054187192118227,\n \"acc_stderr\": 0.03240661565868408,\n \"\
acc_norm\": 0.3054187192118227,\n \"acc_norm_stderr\": 0.03240661565868408\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2676767676767677,\n \"acc_stderr\": 0.03154449888270286,\n \"\
acc_norm\": 0.2676767676767677,\n \"acc_norm_stderr\": 0.03154449888270286\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.03027690994517826,\n\
\ \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.03027690994517826\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.020752423722128013,\n\
\ \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.020752423722128013\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24789915966386555,\n \"acc_stderr\": 0.028047967224176892,\n\
\ \"acc_norm\": 0.24789915966386555,\n \"acc_norm_stderr\": 0.028047967224176892\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23841059602649006,\n \"acc_stderr\": 0.0347918557259966,\n \"\
acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.0347918557259966\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.344954128440367,\n \"acc_stderr\": 0.020380605405066966,\n \"\
acc_norm\": 0.344954128440367,\n \"acc_norm_stderr\": 0.020380605405066966\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3472222222222222,\n \"acc_stderr\": 0.032468872436376486,\n \"\
acc_norm\": 0.3472222222222222,\n \"acc_norm_stderr\": 0.032468872436376486\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.22058823529411764,\n \"acc_stderr\": 0.029102254389674082,\n \"\
acc_norm\": 0.22058823529411764,\n \"acc_norm_stderr\": 0.029102254389674082\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.23628691983122363,\n \"acc_stderr\": 0.027652153144159263,\n \
\ \"acc_norm\": 0.23628691983122363,\n \"acc_norm_stderr\": 0.027652153144159263\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.25112107623318386,\n\
\ \"acc_stderr\": 0.029105220833224615,\n \"acc_norm\": 0.25112107623318386,\n\
\ \"acc_norm_stderr\": 0.029105220833224615\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.21487603305785125,\n \"acc_stderr\": 0.03749492448709698,\n \"\
acc_norm\": 0.21487603305785125,\n \"acc_norm_stderr\": 0.03749492448709698\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.04157751539865629,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.04157751539865629\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.20388349514563106,\n \"acc_stderr\": 0.03989139859531773,\n\
\ \"acc_norm\": 0.20388349514563106,\n \"acc_norm_stderr\": 0.03989139859531773\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23931623931623933,\n\
\ \"acc_stderr\": 0.027951826808924333,\n \"acc_norm\": 0.23931623931623933,\n\
\ \"acc_norm_stderr\": 0.027951826808924333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26436781609195403,\n\
\ \"acc_stderr\": 0.01576998484069052,\n \"acc_norm\": 0.26436781609195403,\n\
\ \"acc_norm_stderr\": 0.01576998484069052\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.26011560693641617,\n \"acc_stderr\": 0.02361867831006937,\n\
\ \"acc_norm\": 0.26011560693641617,\n \"acc_norm_stderr\": 0.02361867831006937\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.02392915551735129,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.02392915551735129\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.21543408360128619,\n\
\ \"acc_stderr\": 0.023350225475471425,\n \"acc_norm\": 0.21543408360128619,\n\
\ \"acc_norm_stderr\": 0.023350225475471425\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.23148148148148148,\n \"acc_stderr\": 0.02346842983245114,\n\
\ \"acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.02346842983245114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25886524822695034,\n \"acc_stderr\": 0.026129572527180844,\n \
\ \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.026129572527180844\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24511082138200782,\n\
\ \"acc_stderr\": 0.010986307870045517,\n \"acc_norm\": 0.24511082138200782,\n\
\ \"acc_norm_stderr\": 0.010986307870045517\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.35661764705882354,\n \"acc_stderr\": 0.029097209568411952,\n\
\ \"acc_norm\": 0.35661764705882354,\n \"acc_norm_stderr\": 0.029097209568411952\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25163398692810457,\n \"acc_stderr\": 0.017555818091322263,\n \
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.017555818091322263\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n\
\ \"acc_stderr\": 0.03895091015724138,\n \"acc_norm\": 0.20909090909090908,\n\
\ \"acc_norm_stderr\": 0.03895091015724138\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24489795918367346,\n \"acc_stderr\": 0.027529637440174934,\n\
\ \"acc_norm\": 0.24489795918367346,\n \"acc_norm_stderr\": 0.027529637440174934\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n\
\ \"acc_stderr\": 0.03076944496729601,\n \"acc_norm\": 0.2537313432835821,\n\
\ \"acc_norm_stderr\": 0.03076944496729601\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036843,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036843\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n\
\ \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n\
\ \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.033773102522091945,\n\
\ \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.033773102522091945\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22888616891064872,\n\
\ \"mc1_stderr\": 0.014706994909055028,\n \"mc2\": 0.4595409979303444,\n\
\ \"mc2_stderr\": 0.01663090921738331\n }\n}\n```"
repo_url: https://huggingface.co/nicholasKluge/Aira-Instruct-PT-1B7
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|arc:challenge|25_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hellaswag|10_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T20:59:57.404122.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:59:57.404122.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T20:59:57.404122.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T20:59:57.404122.parquet'
- config_name: results
data_files:
- split: 2023_08_09T20_59_57.404122
path:
- results_2023-08-09T20:59:57.404122.parquet
- split: latest
path:
- results_2023-08-09T20:59:57.404122.parquet
---
# Dataset Card for Evaluation run of nicholasKluge/Aira-Instruct-PT-1B7
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/nicholasKluge/Aira-Instruct-PT-1B7
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [nicholasKluge/Aira-Instruct-PT-1B7](https://huggingface.co/nicholasKluge/Aira-Instruct-PT-1B7) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-PT-1B7",
"harness_truthfulqa_mc_0",
split="train")
```
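Similarly — a minimal sketch based on the config names listed in the YAML header above — the aggregated metrics can be read from the "results" configuration, whose "latest" split always points to the most recent run:
```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics of the run; the
# "latest" split is an alias for the most recent timestamped split.
results = load_dataset(
    "open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-PT-1B7",
    "results",
    split="latest",
)
```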
## Latest results
These are the [latest results from run 2023-08-09T20:59:57.404122](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-PT-1B7/blob/main/results_2023-08-09T20%3A59%3A57.404122.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2495089085448401,
"acc_stderr": 0.03135286921160441,
"acc_norm": 0.2508452551647926,
"acc_norm_stderr": 0.03137437179137316,
"mc1": 0.22888616891064872,
"mc1_stderr": 0.014706994909055028,
"mc2": 0.4595409979303444,
"mc2_stderr": 0.01663090921738331
},
"harness|arc:challenge|25": {
"acc": 0.2030716723549488,
"acc_stderr": 0.011755899303705583,
"acc_norm": 0.2687713310580205,
"acc_norm_stderr": 0.012955065963710672
},
"harness|hellaswag|10": {
"acc": 0.25941047600079664,
"acc_stderr": 0.004374153847826759,
"acc_norm": 0.2725552678749253,
"acc_norm_stderr": 0.004443639394177424
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.26037735849056604,
"acc_stderr": 0.027008766090708087,
"acc_norm": 0.26037735849056604,
"acc_norm_stderr": 0.027008766090708087
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.17,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.17,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.24680851063829787,
"acc_stderr": 0.028185441301234102,
"acc_norm": 0.24680851063829787,
"acc_norm_stderr": 0.028185441301234102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2620689655172414,
"acc_stderr": 0.036646663372252565,
"acc_norm": 0.2620689655172414,
"acc_norm_stderr": 0.036646663372252565
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25132275132275134,
"acc_stderr": 0.022340482339643898,
"acc_norm": 0.25132275132275134,
"acc_norm_stderr": 0.022340482339643898
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03333333333333337,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03333333333333337
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2870967741935484,
"acc_stderr": 0.025736542745594525,
"acc_norm": 0.2870967741935484,
"acc_norm_stderr": 0.025736542745594525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3054187192118227,
"acc_stderr": 0.03240661565868408,
"acc_norm": 0.3054187192118227,
"acc_norm_stderr": 0.03240661565868408
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2676767676767677,
"acc_stderr": 0.03154449888270286,
"acc_norm": 0.2676767676767677,
"acc_norm_stderr": 0.03154449888270286
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22797927461139897,
"acc_stderr": 0.03027690994517826,
"acc_norm": 0.22797927461139897,
"acc_norm_stderr": 0.03027690994517826
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.020752423722128013,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.020752423722128013
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24789915966386555,
"acc_stderr": 0.028047967224176892,
"acc_norm": 0.24789915966386555,
"acc_norm_stderr": 0.028047967224176892
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.0347918557259966,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.0347918557259966
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.344954128440367,
"acc_stderr": 0.020380605405066966,
"acc_norm": 0.344954128440367,
"acc_norm_stderr": 0.020380605405066966
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.032468872436376486,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.032468872436376486
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.22058823529411764,
"acc_stderr": 0.029102254389674082,
"acc_norm": 0.22058823529411764,
"acc_norm_stderr": 0.029102254389674082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.23628691983122363,
"acc_stderr": 0.027652153144159263,
"acc_norm": 0.23628691983122363,
"acc_norm_stderr": 0.027652153144159263
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.25112107623318386,
"acc_stderr": 0.029105220833224615,
"acc_norm": 0.25112107623318386,
"acc_norm_stderr": 0.029105220833224615
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.21487603305785125,
"acc_stderr": 0.03749492448709698,
"acc_norm": 0.21487603305785125,
"acc_norm_stderr": 0.03749492448709698
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.04157751539865629,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.04157751539865629
},
"harness|hendrycksTest-management|5": {
"acc": 0.20388349514563106,
"acc_stderr": 0.03989139859531773,
"acc_norm": 0.20388349514563106,
"acc_norm_stderr": 0.03989139859531773
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23931623931623933,
"acc_stderr": 0.027951826808924333,
"acc_norm": 0.23931623931623933,
"acc_norm_stderr": 0.027951826808924333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26436781609195403,
"acc_stderr": 0.01576998484069052,
"acc_norm": 0.26436781609195403,
"acc_norm_stderr": 0.01576998484069052
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.02361867831006937,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.02361867831006937
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.02392915551735129,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.02392915551735129
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.21543408360128619,
"acc_stderr": 0.023350225475471425,
"acc_norm": 0.21543408360128619,
"acc_norm_stderr": 0.023350225475471425
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.02346842983245114,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.02346842983245114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.026129572527180844,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.026129572527180844
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24511082138200782,
"acc_stderr": 0.010986307870045517,
"acc_norm": 0.24511082138200782,
"acc_norm_stderr": 0.010986307870045517
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.35661764705882354,
"acc_stderr": 0.029097209568411952,
"acc_norm": 0.35661764705882354,
"acc_norm_stderr": 0.029097209568411952
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.017555818091322263,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.017555818091322263
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.03895091015724138,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.03895091015724138
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24489795918367346,
"acc_stderr": 0.027529637440174934,
"acc_norm": 0.24489795918367346,
"acc_norm_stderr": 0.027529637440174934
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2537313432835821,
"acc_stderr": 0.03076944496729601,
"acc_norm": 0.2537313432835821,
"acc_norm_stderr": 0.03076944496729601
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036843,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036843
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.033773102522091945,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.033773102522091945
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22888616891064872,
"mc1_stderr": 0.014706994909055028,
"mc2": 0.4595409979303444,
"mc2_stderr": 0.01663090921738331
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Codec-SUPERB/gunshot_triangulation_extract_unit | ---
configs:
- config_name: default
data_files:
- split: academicodec_hifi_16k_320d
path: data/academicodec_hifi_16k_320d-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
- split: audiodec_24k_320d
path: data/audiodec_24k_320d-*
- split: dac_16k
path: data/dac_16k-*
- split: dac_24k
path: data/dac_24k-*
- split: dac_44k
path: data/dac_44k-*
- split: encodec_24k
path: data/encodec_24k-*
- split: funcodec_en_libritts_16k_gr1nq32ds320
path: data/funcodec_en_libritts_16k_gr1nq32ds320-*
- split: funcodec_en_libritts_16k_gr8nq32ds320
path: data/funcodec_en_libritts_16k_gr8nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds320
path: data/funcodec_en_libritts_16k_nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds640
path: data/funcodec_en_libritts_16k_nq32ds640-*
- split: funcodec_zh_en_16k_nq32ds320
path: data/funcodec_zh_en_16k_nq32ds320-*
- split: funcodec_zh_en_16k_nq32ds640
path: data/funcodec_zh_en_16k_nq32ds640-*
- split: speech_tokenizer_16k
path: data/speech_tokenizer_16k-*
dataset_info:
features:
- name: id
dtype: string
- name: unit
sequence:
sequence: int64
splits:
- name: academicodec_hifi_16k_320d
num_bytes: 214680
num_examples: 88
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 214680
num_examples: 88
- name: academicodec_hifi_24k_320d
num_bytes: 318872
num_examples: 88
- name: audiodec_24k_320d
num_bytes: 680728
num_examples: 88
- name: dac_16k
num_bytes: 1442456
num_examples: 88
- name: dac_24k
num_bytes: 4000792
num_examples: 88
- name: dac_44k
num_bytes: 1373816
num_examples: 88
- name: encodec_24k
num_bytes: 161880
num_examples: 88
- name: funcodec_en_libritts_16k_gr1nq32ds320
num_bytes: 1725464
num_examples: 88
- name: funcodec_en_libritts_16k_gr8nq32ds320
num_bytes: 1725464
num_examples: 88
- name: funcodec_en_libritts_16k_nq32ds320
num_bytes: 1702936
num_examples: 88
- name: funcodec_en_libritts_16k_nq32ds640
num_bytes: 869400
num_examples: 88
- name: funcodec_zh_en_16k_nq32ds320
num_bytes: 1702936
num_examples: 88
- name: funcodec_zh_en_16k_nq32ds640
num_bytes: 1702936
num_examples: 88
- name: speech_tokenizer_16k
num_bytes: 427288
num_examples: 88
download_size: 2845431
dataset_size: 18264328
---
# Dataset Card for "gunshot_triangulation_extract_unit"
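The YAML header above lists one split per codec. A minimal loading sketch, following the pattern used elsewhere in this document; the choice of the `encodec_24k` split is arbitrary, and any other split name listed above works the same way:

```python
from datasets import load_dataset

# A minimal sketch: each split holds the discrete units produced by one codec.
units = load_dataset("Codec-SUPERB/gunshot_triangulation_extract_unit",
                     split="encodec_24k")
print(units[0]["id"])         # example identifier (string)
print(len(units[0]["unit"]))  # outer length of the `unit` feature
                              # (a sequence of int64 sequences)
```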
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
codeparrot/github-jupyter | ---
annotations_creators: []
language_creators:
- crowdsourced
- expert-generated
language:
- code
license:
- other
multilinguality:
- monolingual
size_categories:
- unknown
source_datasets: []
task_categories:
- text-generation
task_ids:
- language-modeling
---
# GitHub Jupyter Dataset
## Dataset Description
The dataset was extracted from Jupyter Notebooks on BigQuery.
## Licenses
Each example carries the license of its associated repository. There are 15 licenses in total:
```python
[
'mit',
'apache-2.0',
'gpl-3.0',
'gpl-2.0',
'bsd-3-clause',
'agpl-3.0',
'lgpl-3.0',
'lgpl-2.1',
'bsd-2-clause',
'cc0-1.0',
'epl-1.0',
'mpl-2.0',
'unlicense',
'isc',
'artistic-2.0'
]
```
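Since each example carries its repository's license, one might filter to permissively licensed notebooks. A minimal sketch; the `license` column name is an assumption not explicitly confirmed by this card:

```python
from datasets import load_dataset

# A minimal sketch: keep only examples under permissive licenses.
# Assumes the per-example license is exposed as a `license` column.
ds = load_dataset("codeparrot/github-jupyter", split="train")
permissive = {"mit", "apache-2.0", "bsd-3-clause", "bsd-2-clause", "isc"}
filtered = ds.filter(lambda ex: ex["license"] in permissive)
print(f"kept {len(filtered)} of {len(ds)} examples")
```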
|