datasetId stringlengths 2 117 | card stringlengths 19 1.01M |
|---|---|
PL-MTEB/plsc-clustering-p2p | ---
license: cc0-1.0
---
|
Kachu/joaoporrafinal | ---
license: openrail
---
|
CyberHarem/nanami_touko_yagatekimininaru | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Nanami Touko
This is the dataset of Nanami Touko, containing 298 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 298 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 683 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 826 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 298 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 298 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 298 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 683 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 683 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 595 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 826 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 826 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
felipesampaio2010/ShaunTaylorCorbett | ---
license: openrail
---
|
DIAS123/vozingles | ---
license: openrail
---
|
Ukhushn/home-depot | ---
language:
- en
language_bcp47:
- en-US
license:
- afl-3.0
annotations_creators:
- no-annotation
language_creators:
- found
multilinguality:
- monolingual
pretty_name: Ukhushn/home-depot
size_categories:
- 10K<n<100K
source_datasets: []
task_categories:
- sentence-similarity
task_ids: []
---
# Dataset Card for Ukhushn/home-depot
|
michaelginn/latent-trees-agreement-ID | ---
dataset_info:
features:
- name: text
dtype: string
- name: labels
dtype:
class_label:
names:
'0': '0'
'1': '1'
- name: depth
dtype: int64
splits:
- name: train
num_bytes: 107176.8
num_examples: 2400
- name: eval
num_bytes: 35725.6
num_examples: 800
- name: test
num_bytes: 35725.6
num_examples: 800
download_size: 56457
dataset_size: 178628.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: eval
path: data/eval-*
- split: test
path: data/test-*
---
# Dataset Card for "latent-trees-agreement-ID"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
japanese-asr/whisper_transcriptions.reazonspeech.tiny.wer_10.0.vectorized | ---
dataset_info:
config_name: tiny
features:
- name: input_length
dtype: int64
- name: labels
sequence: int64
- name: input_features
sequence:
sequence: float32
splits:
- name: train
num_bytes: 2716883232
num_examples: 1768
download_size: 506052787
dataset_size: 2716883232
configs:
- config_name: tiny
data_files:
- split: train
path: tiny/train-*
---
|
ftang97/sw-consultancy-agent | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 2311272.0
num_examples: 282
- name: test
num_bytes: 262272.0
num_examples: 32
download_size: 1195336
dataset_size: 2573544.0
---
# Dataset Card for "sw-consultancy-agent"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
avsolatorio/mteb-toxic_conversations_50k-avs_triplets | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: label_text
dtype: string
- name: idx
dtype: int64
- name: query_idx
dtype: int64
- name: positive_idx
dtype: int64
- name: negative_idx
dtype: int64
splits:
- name: train
num_bytes: 17791088
num_examples: 50000
download_size: 11682866
dataset_size: 17791088
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# MTEB Toxic Conversations 50k Triplets Dataset
This dataset was used in the paper GISTEmbed: Guided In-sample Selection of Training Negatives for Text Embedding Fine-tuning. Refer to https://arxiv.org/abs/2402.16829 for details.
The code for generating the data is available at https://github.com/avsolatorio/GISTEmbed/blob/main/scripts/create_classification_dataset.py.
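As a rough illustration (not code from the GISTEmbed repository), the `query_idx`/`positive_idx`/`negative_idx` columns can be resolved into text triplets as sketched below, assuming `idx` uniquely identifies each row within the split; the rows are toy data, not real examples:

```python
# Toy rows following the schema above (text, idx, query_idx,
# positive_idx, negative_idx); label columns omitted for brevity.
rows = [
    {"idx": 0, "text": "you are awful", "query_idx": 0, "positive_idx": 1, "negative_idx": 2},
    {"idx": 1, "text": "you are terrible", "query_idx": 0, "positive_idx": 1, "negative_idx": 2},
    {"idx": 2, "text": "have a nice day", "query_idx": 0, "positive_idx": 1, "negative_idx": 2},
]

# Index rows by idx so the *_idx columns can be dereferenced.
by_idx = {row["idx"]: row["text"] for row in rows}

def to_triplet(row):
    """Map one row to its (query, positive, negative) texts."""
    return (
        by_idx[row["query_idx"]],
        by_idx[row["positive_idx"]],
        by_idx[row["negative_idx"]],
    )

triplets = [to_triplet(r) for r in rows]
```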
## Citation
```
@article{solatorio2024gistembed,
title={GISTEmbed: Guided In-sample Selection of Training Negatives for Text Embedding Fine-tuning},
author={Aivin V. Solatorio},
journal={arXiv preprint arXiv:2402.16829},
year={2024},
url={https://arxiv.org/abs/2402.16829},
eprint={2402.16829},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
``` |
ariscult/ageternalsunshine | ---
license: openrail
---
|
ctoraman/gender-hate-speech | ---
license: cc-by-nc-sa-4.0
task_categories:
- text-classification
language:
- en
tags:
- hate speech
- hate speech detection
- hate-speech
- tweets
- social media
- hate-speech-detection
- gender identity
- gender
---
This is the "gender identity" subset of the large-scale dataset published in the LREC 2022 paper "Large-Scale Hate Speech Detection with Cross-Domain Transfer".
This subset is used in the experiments of "Şahinuç, F., Yilmaz, E. H., Toraman, C., & Koç, A. (2023). The effect of gender bias on hate speech detection. Signal, Image and Video Processing, 17(4), 1591-1597."
The "gender identity" subset includes 20,000 tweets in English.
The published data split is the first fold of the 10-fold cross-validation used in the experiments mentioned above.
The train split has 18,000 tweets; the test split has 2,000 tweets.
HateLabel:
- 0 Normal
- 1 Offensive
- 2 Hate
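The label scheme above can be expressed as a small mapping, e.g. for converting label ids back to names (a trivial sketch; `label_name` is an illustrative helper, not part of the dataset):

```python
# Label scheme of the dataset, as listed above.
HATE_LABELS = {0: "Normal", 1: "Offensive", 2: "Hate"}

def label_name(hate_label: int) -> str:
    """Map a HateLabel id to its human-readable name."""
    return HATE_LABELS[hate_label]
```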
# GitHub Repo:
https://github.com/avaapm/hatespeech
# If you use this dataset, please cite the following papers:
- Toraman, C., Şahinuç, F., & Yilmaz, E. (2022, June). Large-Scale Hate Speech Detection with Cross-Domain Transfer. In Proceedings of the Thirteenth Language Resources and Evaluation Conference (pp. 2215-2225).
- Şahinuç, F., Yilmaz, E. H., Toraman, C., & Koç, A. (2023). The effect of gender bias on hate speech detection. Signal, Image and Video Processing, 17(4), 1591-1597. |
sruly/search_training_data.csv | ---
license: apache-2.0
---
|
joey234/imdb_affix_neg | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': neg
'1': pos
- name: words_with_affixes
sequence: string
splits:
- name: test
num_bytes: 40896361
num_examples: 18618
download_size: 11872416
dataset_size: 40896361
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "imdb_affix_neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fuyu-quant/ibl-regression-ver4-branch-pred | ---
dataset_info:
features:
- name: prediction
dtype: string
- name: 'true'
dtype: string
- name: index
dtype: int64
splits:
- name: pred
num_bytes: 227616
num_examples: 1000
download_size: 57462
dataset_size: 227616
configs:
- config_name: default
data_files:
- split: pred
path: data/pred-*
---
|
MuratcanKoylan/MarketingStructuralPrompts | ---
license: mit
task_categories:
- text-generation
language:
- en
tags:
- marketing
- prompting
- template
size_categories:
- 1K<n<10K
---
# README.md
## Enhancing Large Language Model Performance in Digital Marketing Strategies with a Specialized Prompt Dataset
### Creator: Muratcan Koylan
---
### About the Dataset
This dataset, comprising 4,643 specialized prompts across various categories of digital marketing, aims to enhance the performance of Large Language Models (LLMs) like GPT-3 in generating accurate, relevant, and industry-specific marketing strategies.
The prompts break down by category as follows:
- 30 Paid Search Prompts
- 15 ROAS Prompts
- 45 Facebook Ads Prompts
- 13 Google Remarketing Prompts
- 15 Ad Network Prompts
- 14 LinkedIn Ads Prompts
- 14 Advertising Budget Prompts
- 16 Quality Score Prompts
- 14 Bing Ads Prompts
- 15 Classified Advertising Prompts
- 20 CPM Prompts
- 15 X (Twitter) Prompts
- 15 CPC Prompts
- 15 PPC Prompts
- 15 Instagram Ads Prompts
- 15 YouTube Ads Prompts
- 15 Google Ads Prompts
- 15 Programmatic Advertising Prompts
- 15 Remarketing Prompts
- 15 CPV Prompts
- 15 Reach Prompts
- 15 CPL Prompts
- 15 Ad Rank Prompts
- 15 Interstitial Prompts
- 15 AdSense Prompts
- 15 SEM Prompts
- 20 Affiliates Prompts
- 15 Display Advertisement Prompts
- 20 Video Ads Prompts
- 20 Mobile Ads Prompts
- 20 TikTok Ads Prompts
- 20 Pinterest Ads Prompts
- 20 Shopping Ads Prompts
#### Dataset Composition:
- **StrategyDomain**: Main category representing the broader strategic area of digital marketing.
- **TacticScope**: Sub-category focusing on specific tactics within the StrategyDomain.
- **StrategicPrompt**: The actual marketing prompt text designed to simulate real-world marketing scenarios.
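A record in this dataset follows the three-field schema above; the sketch below shows the shape of one such record, with hypothetical field values that are not drawn from the actual dataset:

```python
# Hypothetical example record following the three-field schema above;
# the values are illustrative, not taken from the dataset.
example_record = {
    "StrategyDomain": "Paid Search",
    "TacticScope": "Quality Score",
    "StrategicPrompt": (
        "Outline a 30-day plan to raise the Quality Score of an "
        "underperforming branded search campaign."
    ),
}
```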
#### Methodology:
The dataset represents a synergistic fusion of human expertise and advanced AI technology, blending 30% human-generated content with 70% synthetic data crafted using cutting-edge generative AI models like GPT-4, Claude2, and LLama2. This approach strategically leverages the nuanced creativity and contextual understanding of human input, while exponentially expanding the dataset's breadth and depth through AI's vast generative capabilities. This methodology ensures the dataset embodies both the rich, detailed insights of human marketing experts and the diverse, innovative perspectives that AI models can offer.
#### Applications:
- **Fine-Tuning LLMs**: This dataset is pivotal for refining LLMs to produce more targeted, effective marketing strategies. By exposing LLMs to a diverse array of real-world marketing scenarios, they become adept at crafting nuanced and strategically sound solutions.
- **Marketing Campaign Development**: A valuable tool for marketers, this dataset aids in the ideation and development of comprehensive marketing campaigns, offering inspiration and strategic guidance.
- **Training AI Agents**: Ideal for training AI agents to autonomously handle various digital marketing tasks, this dataset can drive efficiency and innovation in marketing automation.
- **Cross-Domain Potential**: Beyond marketing, this dataset's structure and approach hold potential for adaptation and application in sectors like finance, healthcare, and education, where specialized language models can offer significant value.
---
### Experimental Results
Upon rigorous testing against standard LLM benchmarks, the dataset has demonstrated remarkable improvements in producing strategically relevant, creatively rich, and platform-specific accurate marketing content. These results underscore the dataset's efficacy in enhancing the contextual and strategic understanding of LLMs within the realm of digital marketing. Results will be shared in the near future with a proper paper.
---
### Future Directions
Looking ahead, the goal is to continuously evolve and enrich this dataset, incorporating emerging marketing trends and novel concepts. This ongoing development aims to broaden the dataset's utility, making it an indispensable tool for future LLM applications in digital marketing and beyond, including potential cross-disciplinary applications that push the boundaries of AI's role in various professional fields.
---
### Contact and Collaboration
As a fervent advocate for AI-driven innovation in marketing, I welcome collaboration and dialogue with fellow AI enthusiasts, marketers, and builders. My aim is to foster a community of like-minded professionals who are passionate about exploring the frontiers of AI in marketing. Reach out to me on X (@youraimarketer) for any collaboration ideas, discussions, or queries regarding this dataset.
---
### Acknowledgments
This dataset stands as a testament to the power of collaborative innovation, combining the best of human creativity and AI's transformative capabilities. A heartfelt thank you to all the contributors, including AI developers, data scientists, and marketing experts, whose collective efforts have brought this project to fruition.
--- |
KagglingFace/FYP-KiTS-A-Trimmed-Preprocess-Colab | ---
license: mit
---
|
vwxyzjn/lm-human-preferences | ---
license: mit
---
|
mfumanelli/pokemon-description-xs | ---
dataset_info:
features:
- name: name
dtype: string
- name: description
dtype: string
splits:
- name: train
num_bytes: 2839
num_examples: 20
download_size: 4230
dataset_size: 2839
---
# Dataset Card for "pokemon-description-xs"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nikhilno1/guide | ---
license: apache-2.0
language:
- en
pretty_name: User Guide
--- |
enesxgrahovac/the-feynman-lectures-on-physics | ---
dataset_info:
features:
- name: book_volume
dtype: string
- name: book_title
dtype: string
- name: chapter_number
dtype: string
- name: chapter_title
dtype: string
- name: section_number
dtype: string
- name: section_title
dtype: string
- name: section_text
dtype: string
splits:
- name: train
num_bytes: 4609643
num_examples: 641
download_size: 2276758
dataset_size: 4609643
---
# Dataset Card for "the-feynman-lectures-on-physics"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Felipefloke/samantha2.0 | ---
license: openrail
---
|
joey234/mmlu-college_computer_science-original-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 25638
num_examples: 30
download_size: 20298
dataset_size: 25638
---
# Dataset Card for "mmlu-college_computer_science-original-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Wannita/PyCoder | ---
license: mit
datasets:
- Wannita/PyCoder
metrics:
- accuracy
- bleu
- meteor
- exact_match
- rouge
library_name: transformers
pipeline_tag: text-generation
task_categories:
- text-generation
tags:
- code
---
# PyCoder
This repository contains the dataset for the paper [Syntax-Aware On-the-Fly Code Completion](https://arxiv.org/abs/2211.04673).
Sample code to run the model can be found in `assets/notebooks/inference.ipynb` in our GitHub repository: https://github.com/awsm-research/pycoder.
PyCoder is an automatic code completion model that leverages a Multi-Task Training (MTT) technique to cooperatively learn the code prediction task and the type prediction task. For the type prediction task, we propose leveraging the standard Python token type information (e.g., String, Number, Name, Keyword), which is readily available and lightweight, instead of AST information, which requires the source code to be parsable for extraction and thus limits its ability to perform on-the-fly code completion (see Section 2.3 in our paper).
More information can be found in our paper.
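The token types in question come straight from Python's standard library; the sketch below (not PyCoder's actual pipeline) shows how lightweight this information is to extract compared with building an AST:

```python
import io
import token
import tokenize

def token_types(code: str):
    """Return (text, token-type-name) pairs for a Python snippet."""
    readline = io.BytesIO(code.encode("utf-8")).readline
    # Drop bookkeeping tokens so only the source tokens remain.
    skip = {token.ENCODING, token.NEWLINE, token.ENDMARKER}
    return [
        (tok.string, token.tok_name[tok.type])
        for tok in tokenize.tokenize(readline)
        if tok.type not in skip
    ]
```

Unlike `ast.parse`, `tokenize` works even on incomplete prefixes of a file, which is what makes token types usable for on-the-fly completion.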
If you use our code or PyCoder, please cite our paper.
<pre><code>@article{takerngsaksiri2022syntax,
title={Syntax-Aware On-the-Fly Code Completion},
author={Takerngsaksiri, Wannita and Tantithamthavorn, Chakkrit and Li, Yuan-Fang},
journal={arXiv preprint arXiv:2211.04673},
year={2022}
}</code></pre>
|
MASTERREDE/minhavoz100 | ---
license: openrail
---
|
adamo1139/toxic-dpo-natural-v4 | ---
license: other
license_name: other
license_link: LICENSE
---
|
tyzhu/fw_num_train_10000_eval_100 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 1318323
num_examples: 20100
- name: eval_find_word
num_bytes: 4823
num_examples: 100
download_size: 510406
dataset_size: 1323146
---
# Dataset Card for "fw_num_train_10000_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ethan0927/Clone-voiceone | ---
license: agpl-3.0
---
|
cestwc/text_classification | ---
dataset_info:
- config_name: ag_news
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': World
'1': Sports
'2': Business
'3': Sci/Tech
- name: glove
sequence: float64
- name: word2vec
sequence: float64
- name: fasttext
sequence: float64
splits:
- name: train
num_bytes: 747787977
num_examples: 127600
download_size: 717630530
dataset_size: 747787977
- config_name: amazon_reviews
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: glove
sequence: float64
- name: word2vec
sequence: float64
- name: fasttext
sequence: float64
splits:
- name: train
num_bytes: 1218704865
num_examples: 210000
download_size: 1147756545
dataset_size: 1218704865
- config_name: emotion
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': sadness
'1': joy
'2': love
'3': anger
'4': fear
'5': surprise
- name: fasttext
sequence: float64
- name: glove
sequence: float64
- name: word2vec
sequence: float64
splits:
- name: train
num_bytes: 114413401
num_examples: 20000
download_size: 104458522
dataset_size: 114413401
- config_name: imdb
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': neg
'1': pos
- name: fasttext
sequence: float64
- name: glove
sequence: float64
- name: word2vec
sequence: float64
splits:
- name: train
num_bytes: 346683508
num_examples: 50000
download_size: 344879514
dataset_size: 346683508
- config_name: multi_nli
features:
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: text
dtype: string
- name: glove
sequence: float64
- name: word2vec
sequence: float64
- name: fasttext
sequence: float64
splits:
- name: train
num_bytes: 2389531917
num_examples: 412349
download_size: 2243248541
dataset_size: 2389531917
- config_name: tweet_eval
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': negative
'1': neutral
'2': positive
- name: glove
sequence: float64
- name: word2vec
sequence: float64
- name: fasttext
sequence: float64
splits:
- name: train
num_bytes: 343075422
num_examples: 59899
download_size: 315331899
dataset_size: 343075422
- config_name: yelp_review_full
features:
- name: label
dtype:
class_label:
names:
'0': 1 star
'1': 2 star
'2': 3 stars
'3': 4 stars
'4': 5 stars
- name: text
dtype: string
- name: glove
sequence: float64
- name: word2vec
sequence: float64
- name: fasttext
sequence: float64
splits:
- name: train
num_bytes: 4449129014
num_examples: 700000
download_size: 4414593456
dataset_size: 4449129014
configs:
- config_name: ag_news
data_files:
- split: train
path: ag_news/train-*
- config_name: amazon_reviews
data_files:
- split: train
path: amazon_reviews/train-*
- config_name: emotion
data_files:
- split: train
path: emotion/train-*
- config_name: imdb
data_files:
- split: train
path: imdb/train-*
- config_name: multi_nli
data_files:
- split: train
path: multi_nli/train-*
- config_name: tweet_eval
data_files:
- split: train
path: tweet_eval/train-*
- config_name: yelp_review_full
data_files:
- split: train
path: yelp_review_full/train-*
---
|
ammarnasr/the-stack-swift-clean | ---
license: openrail
dataset_info:
features:
- name: hexsha
dtype: string
- name: size
dtype: int64
- name: content
dtype: string
- name: avg_line_length
dtype: float64
- name: max_line_length
dtype: int64
- name: alphanum_fraction
dtype: float64
splits:
- name: train
num_bytes: 3582248477.9086223
num_examples: 806789
- name: test
num_bytes: 394048264.9973618
num_examples: 88747
- name: valid
num_bytes: 3982797.09401595
num_examples: 897
download_size: 1323156008
dataset_size: 3980279540
task_categories:
- text-generation
language:
- code
tags:
- code
pretty_name: TheStack-Swift
size_categories:
- 1M<n<10M
---
## Dataset 1: TheStack - Swift - Cleaned
**Description**: This dataset is drawn from TheStack Corpus, an open-source code dataset with over 3TB of GitHub data covering 48 programming languages. We selected a small portion of this dataset to optimize smaller language models for Swift, a popular statically typed language.
**Target Language**: Swift
**Dataset Size**:
- Training: 900,000 files
- Validation: 50,000 files
- Test: 50,000 files
**Preprocessing**:
1. Selected Swift as the target language due to its popularity on GitHub.
2. Filtered out files with average line length > 100 characters, maximum line length > 1000 characters, and alphabet ratio < 25%.
3. Split files into 90% training, 5% validation, and 5% test sets.
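As a sketch (not the exact cleaning script), the filters in step 2 can be implemented directly on file contents; `keep_file` is an illustrative name, and we assume the "alphabet ratio" corresponds to the `alphanum_fraction` feature in the schema above:

```python
def keep_file(content: str) -> bool:
    """Apply the three filtering criteria from the preprocessing steps."""
    lines = content.splitlines() or [""]
    avg_line_length = sum(len(line) for line in lines) / len(lines)
    max_line_length = max(len(line) for line in lines)
    alphanum = sum(c.isalnum() for c in content)
    alphanum_fraction = alphanum / len(content) if content else 0.0
    return (
        avg_line_length <= 100
        and max_line_length <= 1000
        and alphanum_fraction >= 0.25
    )
```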
**Tokenizer**: Byte Pair Encoding (BPE) tokenizer with tab and whitespace tokens. GPT-2 vocabulary extended with special tokens.
**Training Sequences**: Sequences constructed by joining training data text to reach a context length of 2048 tokens (1024 tokens for full fine-tuning). |
DataStudio/OCRWordLevelClear_05 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 28029345964.881
num_examples: 6777331
download_size: 26708844788
dataset_size: 28029345964.881
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
harishmukkapati/softwareOne | ---
license: mit
---
|
HuggingFaceH4/instruction-dataset | ---
license: apache-2.0
---
This is the blind eval dataset of high-quality, diverse, human-written instructions with demonstrations. We will be using this for step 3 evaluations in our RLHF pipeline. |
CyberHarem/mutsuki_bluearchive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mutsuki/浅黄ムツキ/睦月 (Blue Archive)
This is the dataset of mutsuki/浅黄ムツキ/睦月 (Blue Archive), containing 500 images and their tags.
The core tags of this character are `long_hair, halo, purple_eyes, hair_ornament, white_hair, side_ponytail, grey_hair, pointy_ears, hair_flower, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 943.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mutsuki_bluearchive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 778.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mutsuki_bluearchive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1400 | 1.62 GiB | [Download](https://huggingface.co/datasets/CyberHarem/mutsuki_bluearchive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mutsuki_bluearchive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, looking_at_viewer, red_skirt, short_sleeves, solo, blush, frilled_skirt, plaid_skirt, shirt, simple_background, black_jacket, white_background, black_flower, grin, thigh_strap, neck_ribbon |
| 1 | 14 |  |  |  |  |  | looking_at_viewer, 1girl, alternate_costume, red_necktie, solo, collared_shirt, long_sleeves, simple_background, white_background, white_shirt, white_socks, black_footwear, shoes, black_jacket, black_shorts, blush, full_body, grin, holding, pink_eyes, kneehighs, open_mouth |
| 2 | 28 |  |  |  |  |  | 1girl, solo, fishnet_pantyhose, red_dress, looking_at_viewer, red_halo, alternate_costume, simple_background, blush, white_background, bracelet, small_breasts, open_mouth, black_hairband, earrings, sleeveless_dress, bare_shoulders, black_footwear, grin |
| 3 | 5 |  |  |  |  |  | 1girl, blush, looking_at_viewer, sitting, small_breasts, solo, very_long_hair, :q, closed_mouth, collarbone, completely_nude, navel, nipples, smile, licking_lips, loli, red_halo, black_flower, feet_out_of_frame, heart, pussy, simple_background, spread_legs, stomach, white_background |
| 4 | 5 |  |  |  |  |  | 1girl, blush, completely_nude, looking_at_viewer, navel, nipples, pussy, small_breasts, solo, spread_legs, collarbone, indoors, sitting, stomach, anus, mosaic_censoring, open_mouth, uncensored, :d, armpits, flower, grin, loli, pink_eyes, ribs, sweat, teeth |
| 5 | 6 |  |  |  |  |  | 1boy, 1girl, blush, completely_nude, hetero, loli, navel, nipples, penis, sex, solo_focus, spread_legs, vaginal, looking_at_viewer, on_back, collarbone, missionary, open_mouth, bar_censor, bed_sheet, cum_in_pussy, cum_overflow, flat_chest, heart, pov_crotch, small_breasts, smile |
| 6 | 8 |  |  |  |  |  | 1boy, 1girl, blush, completely_nude, girl_on_top, hetero, navel, nipples, sex, solo_focus, vaginal, cowgirl_position, small_breasts, loli, penis, smile, sweat, cum_in_pussy, flower, open_mouth, cum_overflow, mosaic_censoring, collarbone, flat_chest, heart, looking_at_viewer, tongue_out |
| 7 | 36 |  |  |  |  |  | 1girl, hair_bun, obi, official_alternate_costume, pink_flower, white_kimono, wide_sleeves, long_sleeves, looking_at_viewer, solo, blush, grin, pink_eyes, holding, simple_background, white_background |
| 8 | 6 |  |  |  |  |  | 1girl, blush, collarbone, indoors, looking_at_viewer, navel, sitting, small_breasts, smile, solo, stomach, thighs, bare_shoulders, bow_panties, closed_mouth, underwear_only, window, bra, cameltoe, knee_up, spread_legs, black_panties, on_couch |
| 9 | 6 |  |  |  |  |  | 1boy, 1girl, blush, hetero, looking_at_viewer, penis, solo_focus, fellatio, nude, bar_censor, erection, nipples, pov_crotch, simple_background, white_background |
| 10 | 10 |  |  |  |  |  | 1girl, alternate_costume, fake_animal_ears, looking_at_viewer, playboy_bunny, rabbit_ears, small_breasts, strapless_leotard, bare_shoulders, blush, detached_collar, solo, simple_background, wrist_cuffs, red_leotard, very_long_hair, white_background, pink_eyes, red_halo, black_pantyhose, covered_navel, fake_tail, full_body, grin, heart, high_heels, open_mouth, rabbit_tail |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | red_skirt | short_sleeves | solo | blush | frilled_skirt | plaid_skirt | shirt | simple_background | black_jacket | white_background | black_flower | grin | thigh_strap | neck_ribbon | alternate_costume | red_necktie | collared_shirt | long_sleeves | white_shirt | white_socks | black_footwear | shoes | black_shorts | full_body | holding | pink_eyes | kneehighs | open_mouth | fishnet_pantyhose | red_dress | red_halo | bracelet | small_breasts | black_hairband | earrings | sleeveless_dress | bare_shoulders | sitting | very_long_hair | :q | closed_mouth | collarbone | completely_nude | navel | nipples | smile | licking_lips | loli | feet_out_of_frame | heart | pussy | spread_legs | stomach | indoors | anus | mosaic_censoring | uncensored | :d | armpits | flower | ribs | sweat | teeth | 1boy | hetero | penis | sex | solo_focus | vaginal | on_back | missionary | bar_censor | bed_sheet | cum_in_pussy | cum_overflow | flat_chest | pov_crotch | girl_on_top | cowgirl_position | tongue_out | hair_bun | obi | official_alternate_costume | pink_flower | white_kimono | wide_sleeves | thighs | bow_panties | underwear_only | window | bra | cameltoe | knee_up | black_panties | on_couch | fellatio | nude | erection | fake_animal_ears | playboy_bunny | rabbit_ears | strapless_leotard | detached_collar | wrist_cuffs | red_leotard | black_pantyhose | covered_navel | fake_tail | high_heels | rabbit_tail |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------------------|:------------|:----------------|:-------|:--------|:----------------|:--------------|:--------|:--------------------|:---------------|:-------------------|:---------------|:-------|:--------------|:--------------|:--------------------|:--------------|:-----------------|:---------------|:--------------|:--------------|:-----------------|:--------|:---------------|:------------|:----------|:------------|:------------|:-------------|:--------------------|:------------|:-----------|:-----------|:----------------|:-----------------|:-----------|:-------------------|:-----------------|:----------|:-----------------|:-----|:---------------|:-------------|:------------------|:--------|:----------|:--------|:---------------|:-------|:--------------------|:--------|:--------|:--------------|:----------|:----------|:-------|:-------------------|:-------------|:-----|:----------|:---------|:-------|:--------|:--------|:-------|:---------|:--------|:------|:-------------|:----------|:----------|:-------------|:-------------|:------------|:---------------|:---------------|:-------------|:-------------|:--------------|:-------------------|:-------------|:-----------|:------|:-----------------------------|:--------------|:---------------|:---------------|:---------|:--------------|:-----------------|:---------|:------|:-----------|:----------|:----------------|:-----------|:-----------|:-------|:-----------|:-------------------|:----------------|:--------------|:--------------------|:------------------|:--------------|:--------------|:------------------|:----------------|:------------|:-------------|:--------------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 14 |  |  |  |  |  | X | X | | | X | X | | | | X | X | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 28 |  |  |  |  |  | X | X | | | X | X | | | | X | | X | | X | | | X | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | | | X | X | | | | X | | X | X | | | | | | | | | | | | | | | | | | | | X | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | | | X | X | | | | | | | | X | | | | | | | | | | | | | | X | | X | | | | | X | | | | | X | | | | X | X | X | X | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | | | | | | | | | X | X | X | X | X | | X | | X | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 8 |  |  |  |  |  | X | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | | | | | | | | | X | X | X | X | X | | X | | X | | | | | | X | | | | X | | X | | X | X | X | X | X | X | | | | | X | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 36 |  |  |  |  |  | X | X | | | X | X | | | | X | | X | | X | | | | | | X | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | X | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | X | | | X | X | | X | | X | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 9 | 6 |  |  |  |  |  | X | X | | | | X | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | X | X | | X | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | |
| 10 | 10 |  |  |  |  |  | X | X | | | X | X | | | | X | | X | | X | | | X | | | | | | | | | X | | X | | X | | | X | | X | | | | X | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
microsoft/LCC_python | ---
dataset_info:
features:
- name: gt
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 1761900743
num_examples: 100000
- name: validation
num_bytes: 146577328
num_examples: 10000
- name: test
num_bytes: 149430294
num_examples: 10000
download_size: 703086720
dataset_size: 2057908365
---
# Dataset Card for "LCC_python"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
davanstrien/model-card-2023-07-28 | ---
dataset_info:
features:
- name: hub_id
dtype: string
- name: library_name
dtype: string
- name: model_card_text
dtype: string
splits:
- name: train
num_bytes: 331163660
num_examples: 271078
download_size: 105104933
dataset_size: 331163660
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "model-card-2023-07-28"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lmqg/qa_squadshifts | ---
license: cc-by-4.0
pretty_name: SQuADShifts
language: en
multilinguality: monolingual
size_categories: 1k<n<10k
source_datasets:
- extended|wikipedia
task_categories:
- question-answering
task_ids:
- extractive-qa
---
# Dataset Card for "lmqg/qa_squadshifts"
## Dataset Description
- **Repository:** [https://github.com/asahi417/lm-question-generation](https://github.com/asahi417/lm-question-generation)
- **Paper:** [https://arxiv.org/abs/2004.14444](https://arxiv.org/abs/2004.14444)
- **Point of Contact:** [Asahi Ushio](http://asahiushio.com/)
### Dataset Summary
This is the SQuADShifts dataset with a custom training/validation/test split, following [lmqg/qg_squadshifts](https://huggingface.co/datasets/lmqg/qg_squadshifts).
### Supported Tasks and Leaderboards
* `question-answering`
### Languages
English (en)
## Dataset Structure
### Data Fields
The data fields are the same among all splits.
#### plain_text
- `id`: a `string` feature of id
- `title`: a `string` feature of title of the paragraph
- `context`: a `string` feature of paragraph
- `question`: a `string` feature of question
- `answers`: a `json` feature of answers
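For illustration, a record following the field layout above might look like the sketch below. The values are invented, not taken from the dataset, and the exact shape of `answers` (here assumed to be the usual SQuAD-style JSON with `text` and `answer_start` lists) should be verified against the actual data.

```python
# Hypothetical record following the documented fields; values are illustrative.
example = {
    "id": "nyt-0001",
    "title": "Example Article",
    "context": "The quick brown fox jumps over the lazy dog.",
    "question": "What does the fox jump over?",
    "answers": {"text": ["the lazy dog"], "answer_start": [31]},
}

def extract_answer(record):
    """Return the first gold answer span, verified against the context."""
    start = record["answers"]["answer_start"][0]
    text = record["answers"]["text"][0]
    # Sanity check: the span at answer_start should equal the answer text.
    assert record["context"][start:start + len(text)] == text
    return text

print(extract_answer(example))  # -> the lazy dog
```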
### Data Splits
| name |train | valid | test |
|-------------|------:|------:|-----:|
|default (all)|9209|6283|18844|
| amazon |3295|1648|4942|
| new_wiki |2646|1323|3969|
| nyt |3355|1678|5032|
| reddit |3268|1634|4901|
## Citation Information
```
@inproceedings{miller2020effect,
title={The effect of natural distribution shift on question answering models},
author={Miller, John and Krauth, Karl and Recht, Benjamin and Schmidt, Ludwig},
booktitle={International Conference on Machine Learning},
pages={6905--6916},
year={2020},
organization={PMLR}
}
``` |
CodeT5SmallCAPS/CAPS_Python | ---
dataset_info:
features:
- name: code
dtype: string
- name: code_sememe
dtype: string
- name: token_type
dtype: string
- name: code_dependency
dtype: string
splits:
- name: train
num_bytes: 1702629853.216785
num_examples: 362342
- name: val
num_bytes: 212829906.3916075
num_examples: 45293
- name: test
num_bytes: 212829906.3916075
num_examples: 45293
download_size: 796759125
dataset_size: 2128289666.0
---
# Dataset Card for "DeepCC_Python"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NobodyExistsOnTheInternet/AlpacaToxicQA | ---
tags:
- not-for-all-audiences
---
Use only for Alignment research. NOETI is not responsible for what you might do with it. |
jainabh/smart-contract-w-Slither | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: contract_source
dtype: string
- name: malicious
dtype: bool
- name: mod_source
dtype: string
- name: version
dtype: string
- name: Slither Detectors
dtype: string
- name: confidence
dtype: string
- name: impact
dtype: string
splits:
- name: train
num_bytes: 63905104
num_examples: 2000
download_size: 14239745
dataset_size: 63905104
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "smart-contract-w-Slither"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Staticaaaowplf/Ai_-Pedro_-Luiz | ---
license: apache-2.0
---
|
aleh/aims_4 | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 630264241.0
num_examples: 25
download_size: 141922879
dataset_size: 630264241.0
---
# Dataset Card for "aims_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Kirito3/Minhavoz | ---
license: apache-2.0
---
|
anzorq/kbd_speech_preprocessed_for_whisper_training | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 17767327304
num_examples: 18499
- name: test
num_bytes: 1974680696
num_examples: 2056
download_size: 1602763861
dataset_size: 19742008000
---
# Dataset Card for "kbd_speech_preprocessed_for_whisper_training"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hlapp/ubergraph | ---
license: bsd-3-clause
---
|
medmac01/OpenHermes-AR-300K | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: model_name
dtype: string
- name: custom_instruction
dtype: bool
- name: idx
dtype: float64
- name: topic
dtype: string
- name: language
dtype: string
- name: conversations
dtype: string
- name: system_prompt
dtype: string
- name: avatarUrl
dtype: float64
- name: hash
dtype: float64
- name: category
dtype: float64
- name: id
dtype: string
- name: model
dtype: float64
- name: views
dtype: float64
- name: skip_prompt_formatting
dtype: float64
- name: title
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 606005587
num_examples: 300022
download_size: 249268422
dataset_size: 606005587
---
# Dataset Card for "OpenHermes-AR-300K.csv"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_sst2_aint_before_main | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 5335
num_examples: 39
- name: test
num_bytes: 9823
num_examples: 68
- name: train
num_bytes: 134919
num_examples: 1189
download_size: 75435
dataset_size: 150077
---
# Dataset Card for "MULTI_VALUE_sst2_aint_before_main"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
goethe0101/GWP | ---
license: apache-2.0
---
# Dataset Card for [GWP]
|
lubnaa25/Madima23 | ---
license: afl-3.0
---
The current work is published in the paper "A Comparative Analysis of Sensor-, Geometry-, and Neural-Based Methods for Food Volume Estimation" (https://doi.org/10.1145/3607828.3617794).
The dataset consists of two folders, one for the plastic food and one for the real food. Each meal folder contains the following subfolders for capture distances of 40 and 60 cm:
- The "GT_RECAP": The point clouds for each food item and for the total meal.
- The "INTELRS": The original RGB image (image_1_original.jpg) and original depth image (image_1_original_depth.png) captured by the Intel RealSense D455 sensor, the segmented food items (mask.png) and the information about each segmented food item (details.txt).
- The "LIDAR": The original RGB image (image_1_original.jpg) and original depth image (image_1_original_depth.png) captured by the iPhone 14 Pro integrated with a LiDAR sensor, the scaled depth (real_depth.npy), the segmented food items (mask.png) and the information about each segmented food item (details.txt).
- The "STEREO": The original RGB images from 90 and 75 degrees (image_1_original.jpg, image_2_original.jpg) captured by the OnePlus 7 Pro, the segmented food items (mask.png, mask2.png), the gravity data (Gravity_image_1.json, Gravity_image_2.json) and the information about each segmented food item (details.txt).
- The "ZOE": The original RGB image (image_1_original.jpg) captured by the iPhone 14 Pro, the segmented food items (merged_mask.png) and the information about each segmented food item (details.txt).
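As a minimal sketch of how the depth and mask files described above might be combined into per-item depth statistics: the helper below assumes `real_depth.npy` decodes to a 2-D array of per-pixel depth values and `mask.png` to a binary segmentation mask; the function name and the synthetic arrays are illustrative, not part of the dataset.

```python
import numpy as np

def mean_food_depth(depth, mask):
    """Mean depth over the masked food pixels.

    depth: 2-D float array of per-pixel depth values.
    mask:  2-D boolean (or 0/1) array marking one food item.
    """
    mask = mask.astype(bool)
    if not mask.any():
        raise ValueError("mask selects no pixels")
    return float(depth[mask].mean())

# Illustrative call with synthetic arrays; with the real data you would use
# np.load("real_depth.npy") and an image loader for mask.png instead.
depth = np.array([[0.40, 0.41], [0.60, 0.39]])
mask = np.array([[1, 1], [0, 1]])
print(mean_food_depth(depth, mask))  # -> 0.4
```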
Additionally, you can find the "Volume GT Meals_MADIMA2023.xlsx" file, which contains all the ground-truth volumes, and the "gocarb.jpg" image of the reference card (actual size: 8.5 cm × 5.5 cm). |
AdapterOcean/augmentatio-standardized_cluster_5_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 6032703
num_examples: 5848
download_size: 2577283
dataset_size: 6032703
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "augmentatio-standardized_cluster_5_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
acampi/luke_combs | ---
license: apache-2.0
---
|
MinervaAI/Aesir-Preview | ---
license: apache-2.0
tags:
- not-for-all-audiences
- roleplay
- conversational
size_categories:
- 1K<n<10K
---
## MinervaAI is proud to present its very first public dataset release: Aesir-Preview
⚠️ **WARNING:** This is a preview dataset and may not reflect the content or quality of the final result. Use discretion and caution when accessing or utilizing this data.
Contained within this ShareGPT-based dataset are 1000 fully synthetic roleplay dialogue generations between an anonymous user and character cards from Chub.ai, the latter of which were carefully checked, corrected and improved upon.
Each generation is the result of dozens of automated validations, corrections and manual curations to ensure it is of the highest quality that can be achieved within the limitations of the model used, which was GPT 3.5 Instruct.
⚠️ **NSFW WARNING:** This dataset is filled to the brim with NSFW data and contains a wide variety of erotic themes, potentially disturbing scenes and very strong language. Models trained on this data will therefore be heavily biased towards recreating such behaviour.
By Gryphe, Doctor Shotgun, IkariDev, Undi, Mixel, [Chat Error], kubernetes_bad and StefanGliga
|
shidowake/FreedomIntelligence_alpaca-gpt4-japanese_subset_split_8 | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 4863217.322740098
num_examples: 4997
download_size: 2507875
dataset_size: 4863217.322740098
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pushpdeep/fake_news_combined | ---
license: apache-2.0
---
**Label Description**
- 0: Fake
- 1: Real |
autoevaluate/autoeval-staging-eval-project-cnn_dailymail-c1b20bff-12875717 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- cnn_dailymail
eval_info:
task: summarization
model: google/pegasus-cnn_dailymail
metrics: []
dataset_name: cnn_dailymail
dataset_config: 3.0.0
dataset_split: test
col_mapping:
text: article
target: highlights
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: google/pegasus-cnn_dailymail
* Dataset: cnn_dailymail
* Config: 3.0.0
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@grapplerulrich](https://huggingface.co/grapplerulrich) for evaluating this model. |
dmrau/cqadupstack-mathematica-qrels | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: int64
splits:
- name: test
num_bytes: 34691
num_examples: 1358
download_size: 0
dataset_size: 34691
---
# Dataset Card for "cqadupstack-mathematica-qrels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sazirarrwth99/web_nlg_test | ---
dataset_info:
features:
- name: id
dtype: string
- name: old_id
dtype: string
- name: text
dtype: string
- name: category
dtype: string
- name: size
dtype: string
- name: shape
dtype: string
- name: shape_type
dtype: string
- name: triplets
dtype: string
- name: question_entities
dtype: string
- name: superclasses
dtype: string
- name: triplets_subgraph
dtype: string
- name: superclasses_new_entities
dtype: string
- name: possible_classes
dtype: string
- name: possible_classes_no_comment
dtype: string
- name: possible_object_properties
dtype: string
- name: possible_object_properties_no_comment
dtype: string
- name: possible_data_properties
dtype: string
- name: possible_data_properties_no_comment
dtype: string
splits:
- name: train
num_bytes: 4035959
num_examples: 1298
download_size: 759726
dataset_size: 4035959
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
emrecan/stsb-mt-turkish | ---
language_creators:
- machine-generated
language:
- tr
size_categories:
- 1K<n<10K
source_datasets:
- extended|other-sts-b
task_categories:
- text-classification
task_ids:
- semantic-similarity-scoring
- text-scoring
---
# STSb Turkish
Semantic textual similarity dataset for the Turkish language. It is a machine translation (Azure) of the [STSb English](http://ixa2.si.ehu.eus/stswiki/index.php/STSbenchmark) dataset. This dataset is not reviewed by expert human translators.
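Since the card lists `semantic-similarity-scoring` as a task id, here is a pure-Python sketch of the standard STS evaluation metric, Pearson correlation between gold and predicted similarity scores. The score values below are invented for illustration.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation, the usual metric for STS benchmarks."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / sqrt(vx * vy)

# Gold similarity scores (0-5 scale, as in STS-B) vs. hypothetical predictions.
gold = [0.0, 2.5, 5.0, 3.0]
pred = [0.4, 2.1, 4.8, 3.3]
print(round(pearson(gold, pred), 3))
```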
Uploaded from [this repository](https://github.com/emrecncelik/sts-benchmark-tr). |
tcrouzet/tcrouzet_blog | ---
license: apache-2.0
language:
- fr
tags:
- instruction-finetuning
pretty_name: tcrouzet blog
task_categories:
- text-generation
---
|
datasets-examples/doc-unsupported-1 | ---
configs:
- config_name: csv
data_files: "*.csv"
- config_name: tsv
data_files: "*.tsv"
- config_name: json
data_files: "*.json"
- config_name: jsonl
data_files: "*.jsonl"
- config_name: txt
data_files: "*.txt"
size_categories:
- n<1K
---
# [doc] formats 1
This dataset contains files in a collection of supported formats, each of which is loaded as a separate config (see the YAML field `configs`).
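To illustrate how the `data_files` globs above route a repository's files into configs, here is a rough pure-Python analogue using `fnmatch`. This mimics only the pattern matching; the real loader's glob semantics are richer, and the file names below are invented.

```python
from fnmatch import fnmatch

# Config-name -> glob mapping, mirroring the YAML `configs` field above.
configs = {
    "csv": "*.csv",
    "tsv": "*.tsv",
    "json": "*.json",
    "jsonl": "*.jsonl",
    "txt": "*.txt",
}

def files_for_config(files, pattern):
    """Return the files a given config's glob would select."""
    return [f for f in files if fnmatch(f, pattern)]

repo_files = ["data.csv", "data.tsv", "notes.txt", "records.jsonl"]
print(files_for_config(repo_files, configs["jsonl"]))  # -> ['records.jsonl']
```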
|
AlekseyKorshuk/evol-codealpaca-v1-sft | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 99287969
num_examples: 39882
download_size: 49257160
dataset_size: 99287969
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
qgiaohc/twitter_dataset_1713187049 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 28632
num_examples: 64
download_size: 15127
dataset_size: 28632
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v2 | ---
pretty_name: Evaluation run of yeontaek/llama-2-70B-ensemble-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeontaek/llama-2-70B-ensemble-v2](https://huggingface.co/yeontaek/llama-2-70B-ensemble-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-31T19:44:15.918763](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v2/blob/main/results_2023-08-31T19%3A44%3A15.918763.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6793562660993889,\n\
\ \"acc_stderr\": 0.03184581364444873,\n \"acc_norm\": 0.6834576899716158,\n\
\ \"acc_norm_stderr\": 0.03181820263146339,\n \"mc1\": 0.4663402692778458,\n\
\ \"mc1_stderr\": 0.017463793867168103,\n \"mc2\": 0.6450685339919277,\n\
\ \"mc2_stderr\": 0.015210507246763325\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6501706484641638,\n \"acc_stderr\": 0.013936809212158303,\n\
\ \"acc_norm\": 0.6877133105802048,\n \"acc_norm_stderr\": 0.013542598541688067\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6491734714200359,\n\
\ \"acc_stderr\": 0.004762534245488399,\n \"acc_norm\": 0.8536148177653854,\n\
\ \"acc_norm_stderr\": 0.003527695149823521\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03317672787533157,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03317672787533157\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7773584905660378,\n \"acc_stderr\": 0.025604233470899095,\n\
\ \"acc_norm\": 0.7773584905660378,\n \"acc_norm_stderr\": 0.025604233470899095\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.03309615177059006,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.03309615177059006\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n\
\ \"acc_stderr\": 0.03750757044895536,\n \"acc_norm\": 0.5895953757225434,\n\
\ \"acc_norm_stderr\": 0.03750757044895536\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6553191489361702,\n \"acc_stderr\": 0.031068985963122145,\n\
\ \"acc_norm\": 0.6553191489361702,\n \"acc_norm_stderr\": 0.031068985963122145\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.025591857761382182,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.025591857761382182\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8161290322580645,\n\
\ \"acc_stderr\": 0.022037217340267833,\n \"acc_norm\": 0.8161290322580645,\n\
\ \"acc_norm_stderr\": 0.022037217340267833\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.028450388805284357,\n\
\ \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.028450388805284357\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822516,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822516\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9222797927461139,\n \"acc_stderr\": 0.019321805557223157,\n\
\ \"acc_norm\": 0.9222797927461139,\n \"acc_norm_stderr\": 0.019321805557223157\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6820512820512821,\n \"acc_stderr\": 0.023610884308927865,\n\
\ \"acc_norm\": 0.6820512820512821,\n \"acc_norm_stderr\": 0.023610884308927865\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948496,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948496\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7563025210084033,\n \"acc_stderr\": 0.02788682807838055,\n \
\ \"acc_norm\": 0.7563025210084033,\n \"acc_norm_stderr\": 0.02788682807838055\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.41721854304635764,\n \"acc_stderr\": 0.04026141497634612,\n \"\
acc_norm\": 0.41721854304635764,\n \"acc_norm_stderr\": 0.04026141497634612\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8825688073394495,\n \"acc_stderr\": 0.013802780227377355,\n \"\
acc_norm\": 0.8825688073394495,\n \"acc_norm_stderr\": 0.013802780227377355\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9019607843137255,\n \"acc_stderr\": 0.0208711184555521,\n \"acc_norm\"\
: 0.9019607843137255,\n \"acc_norm_stderr\": 0.0208711184555521\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.8776371308016878,\n \"acc_stderr\": 0.02133174182974679,\n \"\
acc_norm\": 0.8776371308016878,\n \"acc_norm_stderr\": 0.02133174182974679\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7533632286995515,\n\
\ \"acc_stderr\": 0.028930413120910888,\n \"acc_norm\": 0.7533632286995515,\n\
\ \"acc_norm_stderr\": 0.028930413120910888\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8396946564885496,\n \"acc_stderr\": 0.03217829420744633,\n\
\ \"acc_norm\": 0.8396946564885496,\n \"acc_norm_stderr\": 0.03217829420744633\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8347107438016529,\n \"acc_stderr\": 0.03390780612972776,\n \"\
acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.03390780612972776\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.03957835471980979,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.03957835471980979\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.03044677768797173,\n\
\ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.03044677768797173\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8582375478927203,\n\
\ \"acc_stderr\": 0.012473289071272051,\n \"acc_norm\": 0.8582375478927203,\n\
\ \"acc_norm_stderr\": 0.012473289071272051\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7601156069364162,\n \"acc_stderr\": 0.022989592543123567,\n\
\ \"acc_norm\": 0.7601156069364162,\n \"acc_norm_stderr\": 0.022989592543123567\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5463687150837989,\n\
\ \"acc_stderr\": 0.016650437588269076,\n \"acc_norm\": 0.5463687150837989,\n\
\ \"acc_norm_stderr\": 0.016650437588269076\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.0256468630971379,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.0256468630971379\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7588424437299035,\n\
\ \"acc_stderr\": 0.02429659403476343,\n \"acc_norm\": 0.7588424437299035,\n\
\ \"acc_norm_stderr\": 0.02429659403476343\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7839506172839507,\n \"acc_stderr\": 0.022899162918445796,\n\
\ \"acc_norm\": 0.7839506172839507,\n \"acc_norm_stderr\": 0.022899162918445796\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5319148936170213,\n \"acc_stderr\": 0.02976667507587387,\n \
\ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.02976667507587387\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5501955671447197,\n\
\ \"acc_stderr\": 0.012705721498564969,\n \"acc_norm\": 0.5501955671447197,\n\
\ \"acc_norm_stderr\": 0.012705721498564969\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7242647058823529,\n \"acc_stderr\": 0.027146271936625166,\n\
\ \"acc_norm\": 0.7242647058823529,\n \"acc_norm_stderr\": 0.027146271936625166\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7287581699346405,\n \"acc_stderr\": 0.017986615304030316,\n \
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.017986615304030316\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.763265306122449,\n \"acc_stderr\": 0.027212835884073153,\n\
\ \"acc_norm\": 0.763265306122449,\n \"acc_norm_stderr\": 0.027212835884073153\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160882,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160882\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4663402692778458,\n\
\ \"mc1_stderr\": 0.017463793867168103,\n \"mc2\": 0.6450685339919277,\n\
\ \"mc2_stderr\": 0.015210507246763325\n }\n}\n```"
repo_url: https://huggingface.co/yeontaek/llama-2-70B-ensemble-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|arc:challenge|25_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hellaswag|10_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T19:44:15.918763.parquet'
- config_name: results
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- results_2023-08-31T19:44:15.918763.parquet
- split: latest
path:
- results_2023-08-31T19:44:15.918763.parquet
---
# Dataset Card for Evaluation run of yeontaek/llama-2-70B-ensemble-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/llama-2-70B-ensemble-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeontaek/llama-2-70B-ensemble-v2](https://huggingface.co/yeontaek/llama-2-70B-ensemble-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v2",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-31T19:44:15.918763](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v2/blob/main/results_2023-08-31T19%3A44%3A15.918763.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6793562660993889,
"acc_stderr": 0.03184581364444873,
"acc_norm": 0.6834576899716158,
"acc_norm_stderr": 0.03181820263146339,
"mc1": 0.4663402692778458,
"mc1_stderr": 0.017463793867168103,
"mc2": 0.6450685339919277,
"mc2_stderr": 0.015210507246763325
},
"harness|arc:challenge|25": {
"acc": 0.6501706484641638,
"acc_stderr": 0.013936809212158303,
"acc_norm": 0.6877133105802048,
"acc_norm_stderr": 0.013542598541688067
},
"harness|hellaswag|10": {
"acc": 0.6491734714200359,
"acc_stderr": 0.004762534245488399,
"acc_norm": 0.8536148177653854,
"acc_norm_stderr": 0.003527695149823521
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7773584905660378,
"acc_stderr": 0.025604233470899095,
"acc_norm": 0.7773584905660378,
"acc_norm_stderr": 0.025604233470899095
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03309615177059006,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03309615177059006
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895536,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895536
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6553191489361702,
"acc_stderr": 0.031068985963122145,
"acc_norm": 0.6553191489361702,
"acc_norm_stderr": 0.031068985963122145
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.025591857761382182,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.025591857761382182
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8161290322580645,
"acc_stderr": 0.022037217340267833,
"acc_norm": 0.8161290322580645,
"acc_norm_stderr": 0.022037217340267833
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8424242424242424,
"acc_stderr": 0.028450388805284357,
"acc_norm": 0.8424242424242424,
"acc_norm_stderr": 0.028450388805284357
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822516,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822516
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9222797927461139,
"acc_stderr": 0.019321805557223157,
"acc_norm": 0.9222797927461139,
"acc_norm_stderr": 0.019321805557223157
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6820512820512821,
"acc_stderr": 0.023610884308927865,
"acc_norm": 0.6820512820512821,
"acc_norm_stderr": 0.023610884308927865
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948496,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7563025210084033,
"acc_stderr": 0.02788682807838055,
"acc_norm": 0.7563025210084033,
"acc_norm_stderr": 0.02788682807838055
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.41721854304635764,
"acc_stderr": 0.04026141497634612,
"acc_norm": 0.41721854304635764,
"acc_norm_stderr": 0.04026141497634612
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8825688073394495,
"acc_stderr": 0.013802780227377355,
"acc_norm": 0.8825688073394495,
"acc_norm_stderr": 0.013802780227377355
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9019607843137255,
"acc_stderr": 0.0208711184555521,
"acc_norm": 0.9019607843137255,
"acc_norm_stderr": 0.0208711184555521
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8776371308016878,
"acc_stderr": 0.02133174182974679,
"acc_norm": 0.8776371308016878,
"acc_norm_stderr": 0.02133174182974679
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7533632286995515,
"acc_stderr": 0.028930413120910888,
"acc_norm": 0.7533632286995515,
"acc_norm_stderr": 0.028930413120910888
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8396946564885496,
"acc_stderr": 0.03217829420744633,
"acc_norm": 0.8396946564885496,
"acc_norm_stderr": 0.03217829420744633
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8347107438016529,
"acc_stderr": 0.03390780612972776,
"acc_norm": 0.8347107438016529,
"acc_norm_stderr": 0.03390780612972776
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.03957835471980979,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.03957835471980979
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.03044677768797173,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.03044677768797173
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8582375478927203,
"acc_stderr": 0.012473289071272051,
"acc_norm": 0.8582375478927203,
"acc_norm_stderr": 0.012473289071272051
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7601156069364162,
"acc_stderr": 0.022989592543123567,
"acc_norm": 0.7601156069364162,
"acc_norm_stderr": 0.022989592543123567
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5463687150837989,
"acc_stderr": 0.016650437588269076,
"acc_norm": 0.5463687150837989,
"acc_norm_stderr": 0.016650437588269076
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.0256468630971379,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.0256468630971379
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7588424437299035,
"acc_stderr": 0.02429659403476343,
"acc_norm": 0.7588424437299035,
"acc_norm_stderr": 0.02429659403476343
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7839506172839507,
"acc_stderr": 0.022899162918445796,
"acc_norm": 0.7839506172839507,
"acc_norm_stderr": 0.022899162918445796
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.02976667507587387,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.02976667507587387
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5501955671447197,
"acc_stderr": 0.012705721498564969,
"acc_norm": 0.5501955671447197,
"acc_norm_stderr": 0.012705721498564969
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7242647058823529,
"acc_stderr": 0.027146271936625166,
"acc_norm": 0.7242647058823529,
"acc_norm_stderr": 0.027146271936625166
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.017986615304030316,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.017986615304030316
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.763265306122449,
"acc_stderr": 0.027212835884073153,
"acc_norm": 0.763265306122449,
"acc_norm_stderr": 0.027212835884073153
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160882,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160882
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4663402692778458,
"mc1_stderr": 0.017463793867168103,
"mc2": 0.6450685339919277,
"mc2_stderr": 0.015210507246763325
}
}
```
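For readers post-processing these dumps, per-task metrics can be pulled out of the nested structure above with a few lines of Python. This is a minimal sketch over a hardcoded excerpt of the JSON shown (not a live download of the dataset); the full file has one entry per harness task plus the `"all"` aggregate:

```python
import json

# Hardcoded excerpt of the results JSON shown above.
results = json.loads("""
{
  "all": {"acc": 0.6793562660993889, "acc_norm": 0.6834576899716158},
  "harness|arc:challenge|25": {"acc": 0.6501706484641638, "acc_norm": 0.6877133105802048},
  "harness|hellaswag|10": {"acc": 0.6491734714200359, "acc_norm": 0.8536148177653854}
}
""")

# Collect per-task accuracies, skipping the "all" aggregate entry.
per_task_acc = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all"
}
print(per_task_acc["harness|hellaswag|10"])  # 0.6491734714200359
```

The same pattern applies to `acc_norm`, `mc1`, and `mc2`, which appear only for the tasks that report them.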
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CVasNLPExperiments/VQAv2_sample_validation_text_davinci_003_mode_T_A_D_PNP_NO_FILTER_C_Q_rices_ns_200 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0
num_bytes: 255182
num_examples: 200
download_size: 126747
dataset_size: 255182
---
# Dataset Card for "VQAv2_sample_validation_text_davinci_003_mode_T_A_D_PNP_NO_FILTER_C_Q_rices_ns_200"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hellotaosir/dreambooth_materials | ---
license: openrail
---
|
thanhduycao/data_synthesis_v1 | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: 'null'
- name: sampling_rate
dtype: int64
- name: transcription
dtype: string
- name: old_transcription
dtype: string
splits:
- name: train
num_bytes: 10125909
num_examples: 20
download_size: 2434457
dataset_size: 10125909
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "data_synthesis_v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rntc/blurb_bc5disease_a-0-tm | ---
dataset_info:
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: type
dtype: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B
'2': I
splits:
- name: train
num_bytes: 16267137
num_examples: 4560
- name: validation
num_bytes: 15854894
num_examples: 4581
- name: test
num_bytes: 16855267
num_examples: 4797
download_size: 6927880
dataset_size: 48977298
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
bartoszmaj/nouns_four | ---
dataset_info:
features:
- name: nouns
sequence: string
splits:
- name: train
num_bytes: 442756142
num_examples: 1600698
download_size: 123686926
dataset_size: 442756142
---
# Dataset Card for "nouns_four"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ianwatts/water_bottle_db | ---
license: mit
---
|
ZhankuiHe/redial_cikm | ---
task_categories:
- conversational
language:
- en
tags:
- recommendation
size_categories:
- 10K<n<100K
---
# Dataset Card for Dataset Name
## Dataset Summary
A dataset consisting of over 10,000 conversations centered around the theme of providing movie recommendations.
## Languages
English
## More Information
This is the [ReDIAL](https://arxiv.org/abs/1812.07617) dataset adapted from the Conversational Recommender System toolkit [CRSLab](https://github.com/RUCAIBox/CRSLab#Datasets). |
mbgenai/bunny_speech_test | ---
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 11657
num_examples: 10
download_size: 10675
dataset_size: 11657
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jan-hq/open_tora_binarized | ---
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 206313539.13326368
num_examples: 118848
- name: test
num_bytes: 22924883.866736334
num_examples: 13206
download_size: 56222419
dataset_size: 229238423.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
ncbi_disease | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- named-entity-recognition
paperswithcode_id: ncbi-disease-1
pretty_name: NCBI Disease
dataset_info:
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-Disease
'2': I-Disease
config_name: ncbi_disease
splits:
- name: train
num_bytes: 2355516
num_examples: 5433
- name: validation
num_bytes: 413900
num_examples: 924
- name: test
num_bytes: 422842
num_examples: 941
download_size: 1546492
dataset_size: 3192258
train-eval-index:
- config: ncbi_disease
task: token-classification
task_id: multi_class_classification
splits:
train_split: train
eval_split: test
col_mapping:
tokens: text
ner_tags: target
metrics:
- type: accuracy
name: Accuracy
- type: f1
name: F1 macro
args:
average: macro
- type: f1
name: F1 micro
args:
average: micro
- type: f1
name: F1 weighted
args:
average: weighted
- type: precision
name: Precision macro
args:
average: macro
- type: precision
name: Precision micro
args:
average: micro
- type: precision
name: Precision weighted
args:
average: weighted
- type: recall
name: Recall macro
args:
average: macro
- type: recall
name: Recall micro
args:
average: micro
- type: recall
name: Recall weighted
args:
average: weighted
---
# Dataset Card for NCBI Disease
## Table of Contents
- [Dataset Card for NCBI Disease](#dataset-card-for-ncbi-disease)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [NCBI](https://www.ncbi.nlm.nih.gov/research/bionlp/Data/disease)
- **Repository:** [Github](https://github.com/spyysalo/ncbi-disease)
- **Paper:** [NCBI disease corpus: A resource for disease name recognition and concept normalization](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3951655)
- **Leaderboard:** [Named Entity Recognition on NCBI-disease](https://paperswithcode.com/sota/named-entity-recognition-ner-on-ncbi-disease)
- **Point of Contact:** [email](zhiyong.lu@nih.gov)
### Dataset Summary
This dataset contains the disease name and concept annotations of the NCBI disease corpus, a collection of 793 PubMed abstracts fully annotated at the mention and concept level to serve as a research resource for the biomedical natural language processing community.
### Supported Tasks and Leaderboards
Named Entity Recognition: [Leaderboard](https://paperswithcode.com/sota/named-entity-recognition-ner-on-ncbi-disease)
### Languages
The text in the dataset is in English. The associated BCP-47 code is `en`.
## Dataset Structure
### Data Instances
Instances of the dataset contain an array of `tokens`, `ner_tags` and an `id`. An example of an instance of the dataset:
```
{
'tokens': ['Identification', 'of', 'APC2', ',', 'a', 'homologue', 'of', 'the', 'adenomatous', 'polyposis', 'coli', 'tumour', 'suppressor', '.'],
'ner_tags': [0, 0, 0, 0, 0, 0, 0, 0, 1, 2, 2, 2, 0, 0],
'id': '0'
}
```
### Data Fields
- `id`: Sentence identifier.
- `tokens`: Array of tokens composing a sentence.
- `ner_tags`: Array of tags, where `0` indicates no disease mentioned, `1` signals the first token of a disease and `2` the subsequent disease tokens.
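Given the tag mapping above, the integer `ner_tags` of an instance decode directly into IOB labels. The following self-contained sketch uses the example instance from the "Data Instances" section rather than a live download:

```python
# Label names in the order declared in the dataset's class_label feature.
label_names = ["O", "B-Disease", "I-Disease"]

# The example instance from the "Data Instances" section above.
tokens = ["Identification", "of", "APC2", ",", "a", "homologue", "of", "the",
          "adenomatous", "polyposis", "coli", "tumour", "suppressor", "."]
ner_tags = [0, 0, 0, 0, 0, 0, 0, 0, 1, 2, 2, 2, 0, 0]

# Pair each token with its decoded label.
labeled = [(tok, label_names[tag]) for tok, tag in zip(tokens, ner_tags)]

# The B/I tags at positions 8-11 form a single disease mention span.
mention = " ".join(tok for tok, tag in zip(tokens, ner_tags) if tag != 0)
print(mention)  # adenomatous polyposis coli tumour
```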
### Data Splits
The data is split into a train (5433 instances), validation (924 instances) and test set (941 instances).
## Dataset Creation
### Curation Rationale
The goal of the dataset is to improve the state of the art in disease name recognition and normalization research by providing a high-quality gold standard, thus enabling the development of machine-learning-based approaches for such tasks.
### Source Data
#### Initial Data Collection and Normalization
The dataset consists of abstracts extracted from PubMed.
#### Who are the source language producers?
The source language producers are the authors of publication abstracts hosted in PubMed.
### Annotations
#### Annotation process
Each PubMed abstract was manually annotated by two annotators with disease mentions and their corresponding concepts in Medical Subject Headings (MeSH®) or Online Mendelian Inheritance in Man (OMIM®). Manual curation was performed using PubTator, which allowed the use of pre-annotations as a pre-step to manual annotations. Fourteen annotators were randomly paired and differing annotations were discussed for reaching a consensus in two annotation phases. Finally, all results were checked against annotations of the rest of the corpus to assure corpus-wide consistency.
#### Who are the annotators?
The annotator group consisted of 14 people with backgrounds in biomedical informatics research and experience in biomedical text corpus annotation.
### Personal and Sensitive Information
[N/A]
## Considerations for Using the Data
### Social Impact of Dataset
Information encoded in natural language in biomedical literature publications is only useful if efficient and reliable ways of accessing and analyzing that information are available. Natural language processing and text mining tools are therefore essential for extracting valuable information. This dataset provides an annotated corpus that can be used to develop highly effective tools to automatically detect central biomedical concepts such as diseases.
### Discussion of Biases
To avoid annotator bias, pairs of annotators were chosen randomly for each set, so that each pair of annotators overlapped for at most two sets.
### Other Known Limitations
A handful of disease concepts were discovered that were not included in MEDIC. For those, we decided to include the appropriate OMIM identifiers.
In addition, certain disease mentions were found to not be easily represented using the standard categorizations.
Also, each PMID document was pre-annotated using the Inference Method developed for disease name normalization, which properly handles abbreviation recognition, robust string matching, etc. As such, human annotators were given the pre-annotated documents as a starting point and allowed to see each pre-annotation with a computed confidence.
## Additional Information
### Dataset Curators
Rezarta Islamaj Doğan, Robert Leaman, Zhiyong Lu
### Licensing Information
```
PUBLIC DOMAIN NOTICE
This work is a "United States Government Work" under the terms of the
United States Copyright Act. It was written as part of the authors'
official duties as a United States Government employee and thus cannot
be copyrighted within the United States. The data is freely available
to the public for use. The National Library of Medicine and the
U.S. Government have not placed any restriction on its use or
reproduction.
Although all reasonable efforts have been taken to ensure the accuracy
and reliability of the data and its source code, the NLM and the
U.S. Government do not and cannot warrant the performance or results
that may be obtained by using it. The NLM and the U.S. Government
disclaim all warranties, express or implied, including warranties of
performance, merchantability or fitness for any particular purpose.
Please cite the authors in any work or product based on this material:
An improved corpus of disease mentions in PubMed citations
http://aclweb.org/anthology-new/W/W12/W12-2411.pdf
NCBI Disease Corpus: A Resource for Disease Name Recognition and
Normalization http://www.ncbi.nlm.nih.gov/pubmed/24393765
Disease Name Normalization with Pairwise Learning to Rank
http://www.ncbi.nlm.nih.gov/pubmed/23969135
```
### Citation Information
```
@article{dougan2014ncbi,
title={NCBI disease corpus: a resource for disease name recognition and concept normalization},
author={Do{\u{g}}an, Rezarta Islamaj and Leaman, Robert and Lu, Zhiyong},
journal={Journal of biomedical informatics},
volume={47},
pages={1--10},
year={2014},
publisher={Elsevier}
}
```
### Contributions
Thanks to [@edugp](https://github.com/edugp) for adding this dataset. |
FINNUMBER/FINCH_TRAIN_QA_EQA_400_NEWFORMAT | ---
dataset_info:
features:
- name: task
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 2166841
num_examples: 400
download_size: 1175765
dataset_size: 2166841
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
McSpicyWithMilo/target-elements-0.2split | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: target_element
dtype: string
- name: instruction_type
dtype: string
splits:
- name: train
num_bytes: 36440.0
num_examples: 320
- name: test
num_bytes: 9110.0
num_examples: 80
download_size: 24201
dataset_size: 45550.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "target-elements-0.2split"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
VictorJsy/College-Entrance-English-Examination-Listening-Part | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_luffycodes__vicuna-class-shishya-all-hal-7b-ep3 | ---
pretty_name: Evaluation run of luffycodes/vicuna-class-shishya-all-hal-7b-ep3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [luffycodes/vicuna-class-shishya-all-hal-7b-ep3](https://huggingface.co/luffycodes/vicuna-class-shishya-all-hal-7b-ep3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one\
\ of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_luffycodes__vicuna-class-shishya-all-hal-7b-ep3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-16T14:43:13.038199](https://huggingface.co/datasets/open-llm-leaderboard/details_luffycodes__vicuna-class-shishya-all-hal-7b-ep3/blob/main/results_2023-12-16T14-43-13.038199.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5101563719470747,\n\
\ \"acc_stderr\": 0.034174127741758806,\n \"acc_norm\": 0.5187563327932377,\n\
\ \"acc_norm_stderr\": 0.035034297734345236,\n \"mc1\": 0.2962056303549572,\n\
\ \"mc1_stderr\": 0.015983595101811392,\n \"mc2\": 0.4483407078884665,\n\
\ \"mc2_stderr\": 0.015114510843263715\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.42150170648464164,\n \"acc_stderr\": 0.014430197069326014,\n\
\ \"acc_norm\": 0.454778156996587,\n \"acc_norm_stderr\": 0.014551507060836355\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5836486755626369,\n\
\ \"acc_stderr\": 0.004919457850104234,\n \"acc_norm\": 0.7720573590918144,\n\
\ \"acc_norm_stderr\": 0.004186480645315569\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.506578947368421,\n \"acc_stderr\": 0.040685900502249704,\n\
\ \"acc_norm\": 0.506578947368421,\n \"acc_norm_stderr\": 0.040685900502249704\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.5547169811320755,\n \"acc_stderr\": 0.030588052974270655,\n \
\ \"acc_norm\": 0.5547169811320755,\n \"acc_norm_stderr\": 0.030588052974270655\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5208333333333334,\n\
\ \"acc_stderr\": 0.041775789507399935,\n \"acc_norm\": 0.5208333333333334,\n\
\ \"acc_norm_stderr\": 0.041775789507399935\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4393063583815029,\n\
\ \"acc_stderr\": 0.037842719328874674,\n \"acc_norm\": 0.4393063583815029,\n\
\ \"acc_norm_stderr\": 0.037842719328874674\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.040233822736177476,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.040233822736177476\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.451063829787234,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.451063829787234,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30158730158730157,\n \"acc_stderr\": 0.023636975996101803,\n \"\
acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.023636975996101803\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\
\ \"acc_stderr\": 0.04263906892795133,\n \"acc_norm\": 0.3492063492063492,\n\
\ \"acc_norm_stderr\": 0.04263906892795133\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5612903225806452,\n \"acc_stderr\": 0.02822949732031722,\n \"\
acc_norm\": 0.5612903225806452,\n \"acc_norm_stderr\": 0.02822949732031722\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4039408866995074,\n \"acc_stderr\": 0.034524539038220406,\n \"\
acc_norm\": 0.4039408866995074,\n \"acc_norm_stderr\": 0.034524539038220406\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6303030303030303,\n \"acc_stderr\": 0.03769430314512568,\n\
\ \"acc_norm\": 0.6303030303030303,\n \"acc_norm_stderr\": 0.03769430314512568\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5858585858585859,\n \"acc_stderr\": 0.03509438348879629,\n \"\
acc_norm\": 0.5858585858585859,\n \"acc_norm_stderr\": 0.03509438348879629\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7564766839378239,\n \"acc_stderr\": 0.030975436386845436,\n\
\ \"acc_norm\": 0.7564766839378239,\n \"acc_norm_stderr\": 0.030975436386845436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5256410256410257,\n \"acc_stderr\": 0.025317649726448663,\n\
\ \"acc_norm\": 0.5256410256410257,\n \"acc_norm_stderr\": 0.025317649726448663\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.47478991596638653,\n \"acc_stderr\": 0.0324371805513741,\n \
\ \"acc_norm\": 0.47478991596638653,\n \"acc_norm_stderr\": 0.0324371805513741\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7027522935779816,\n \"acc_stderr\": 0.019595707224643523,\n \"\
acc_norm\": 0.7027522935779816,\n \"acc_norm_stderr\": 0.019595707224643523\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4027777777777778,\n \"acc_stderr\": 0.03344887382997866,\n \"\
acc_norm\": 0.4027777777777778,\n \"acc_norm_stderr\": 0.03344887382997866\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7107843137254902,\n \"acc_stderr\": 0.031822318676475544,\n \"\
acc_norm\": 0.7107843137254902,\n \"acc_norm_stderr\": 0.031822318676475544\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7172995780590717,\n \"acc_stderr\": 0.029312814153955934,\n \
\ \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.029312814153955934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n\
\ \"acc_stderr\": 0.03266842214289202,\n \"acc_norm\": 0.6143497757847534,\n\
\ \"acc_norm_stderr\": 0.03266842214289202\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5867768595041323,\n \"acc_stderr\": 0.04495087843548408,\n \"\
acc_norm\": 0.5867768595041323,\n \"acc_norm_stderr\": 0.04495087843548408\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.048262172941398944,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.048262172941398944\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5337423312883436,\n \"acc_stderr\": 0.039194155450484096,\n\
\ \"acc_norm\": 0.5337423312883436,\n \"acc_norm_stderr\": 0.039194155450484096\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.04498676320572924,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.04498676320572924\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7735042735042735,\n\
\ \"acc_stderr\": 0.02742100729539292,\n \"acc_norm\": 0.7735042735042735,\n\
\ \"acc_norm_stderr\": 0.02742100729539292\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6896551724137931,\n\
\ \"acc_stderr\": 0.016543785026048315,\n \"acc_norm\": 0.6896551724137931,\n\
\ \"acc_norm_stderr\": 0.016543785026048315\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5606936416184971,\n \"acc_stderr\": 0.026720034380514995,\n\
\ \"acc_norm\": 0.5606936416184971,\n \"acc_norm_stderr\": 0.026720034380514995\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3016759776536313,\n\
\ \"acc_stderr\": 0.015350767572220286,\n \"acc_norm\": 0.3016759776536313,\n\
\ \"acc_norm_stderr\": 0.015350767572220286\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.02818059632825929,\n\
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.02818059632825929\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6109324758842444,\n\
\ \"acc_stderr\": 0.027690337536485372,\n \"acc_norm\": 0.6109324758842444,\n\
\ \"acc_norm_stderr\": 0.027690337536485372\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.027513747284379424,\n\
\ \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.027513747284379424\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36524822695035464,\n \"acc_stderr\": 0.028723863853281285,\n \
\ \"acc_norm\": 0.36524822695035464,\n \"acc_norm_stderr\": 0.028723863853281285\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.37353324641460234,\n\
\ \"acc_stderr\": 0.012354994823515266,\n \"acc_norm\": 0.37353324641460234,\n\
\ \"acc_norm_stderr\": 0.012354994823515266\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.030352303395351964,\n\
\ \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.030352303395351964\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.49836601307189543,\n \"acc_stderr\": 0.020227726838150117,\n \
\ \"acc_norm\": 0.49836601307189543,\n \"acc_norm_stderr\": 0.020227726838150117\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.636734693877551,\n \"acc_stderr\": 0.03078905113903081,\n\
\ \"acc_norm\": 0.636734693877551,\n \"acc_norm_stderr\": 0.03078905113903081\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.681592039800995,\n\
\ \"acc_stderr\": 0.03294118479054095,\n \"acc_norm\": 0.681592039800995,\n\
\ \"acc_norm_stderr\": 0.03294118479054095\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4036144578313253,\n\
\ \"acc_stderr\": 0.038194861407583984,\n \"acc_norm\": 0.4036144578313253,\n\
\ \"acc_norm_stderr\": 0.038194861407583984\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7309941520467836,\n \"acc_stderr\": 0.03401052620104089,\n\
\ \"acc_norm\": 0.7309941520467836,\n \"acc_norm_stderr\": 0.03401052620104089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2962056303549572,\n\
\ \"mc1_stderr\": 0.015983595101811392,\n \"mc2\": 0.4483407078884665,\n\
\ \"mc2_stderr\": 0.015114510843263715\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7103393843725335,\n \"acc_stderr\": 0.012748550807638263\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.024260803639120546,\n \
\ \"acc_stderr\": 0.004238007900001396\n }\n}\n```"
repo_url: https://huggingface.co/luffycodes/vicuna-class-shishya-all-hal-7b-ep3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|arc:challenge|25_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|gsm8k|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hellaswag|10_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T14-43-13.038199.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T14-43-13.038199.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- '**/details_harness|winogrande|5_2023-12-16T14-43-13.038199.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-16T14-43-13.038199.parquet'
- config_name: results
data_files:
- split: 2023_12_16T14_43_13.038199
path:
- results_2023-12-16T14-43-13.038199.parquet
- split: latest
path:
- results_2023-12-16T14-43-13.038199.parquet
---
# Dataset Card for Evaluation run of luffycodes/vicuna-class-shishya-all-hal-7b-ep3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [luffycodes/vicuna-class-shishya-all-hal-7b-ep3](https://huggingface.co/luffycodes/vicuna-class-shishya-all-hal-7b-ep3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_luffycodes__vicuna-class-shishya-all-hal-7b-ep3",
"harness_winogrande_5",
split="train")
```
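The timestamped split names replace the `-` and `:` of the run timestamp with `_`; a minimal sketch for mapping a split name back to a `datetime` (the split name below is copied from this card's configurations):

```python
from datetime import datetime

# Timestamped split name as it appears in this card's configurations.
split_name = "2023_12_16T14_43_13.038199"

# The date part uses '_' where ISO format uses '-', the time part '_' for ':'.
date_part, time_part = split_name.split("T")
run_ts = datetime.strptime(
    f"{date_part.replace('_', '-')}T{time_part.replace('_', ':')}",
    "%Y-%m-%dT%H:%M:%S.%f",
)
print(run_ts)  # 2023-12-16 14:43:13.038199
```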
## Latest results
These are the [latest results from run 2023-12-16T14:43:13.038199](https://huggingface.co/datasets/open-llm-leaderboard/details_luffycodes__vicuna-class-shishya-all-hal-7b-ep3/blob/main/results_2023-12-16T14-43-13.038199.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5101563719470747,
"acc_stderr": 0.034174127741758806,
"acc_norm": 0.5187563327932377,
"acc_norm_stderr": 0.035034297734345236,
"mc1": 0.2962056303549572,
"mc1_stderr": 0.015983595101811392,
"mc2": 0.4483407078884665,
"mc2_stderr": 0.015114510843263715
},
"harness|arc:challenge|25": {
"acc": 0.42150170648464164,
"acc_stderr": 0.014430197069326014,
"acc_norm": 0.454778156996587,
"acc_norm_stderr": 0.014551507060836355
},
"harness|hellaswag|10": {
"acc": 0.5836486755626369,
"acc_stderr": 0.004919457850104234,
"acc_norm": 0.7720573590918144,
"acc_norm_stderr": 0.004186480645315569
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.506578947368421,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.506578947368421,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5547169811320755,
"acc_stderr": 0.030588052974270655,
"acc_norm": 0.5547169811320755,
"acc_norm_stderr": 0.030588052974270655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5208333333333334,
"acc_stderr": 0.041775789507399935,
"acc_norm": 0.5208333333333334,
"acc_norm_stderr": 0.041775789507399935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4393063583815029,
"acc_stderr": 0.037842719328874674,
"acc_norm": 0.4393063583815029,
"acc_norm_stderr": 0.037842719328874674
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.040233822736177476,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.040233822736177476
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.451063829787234,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.451063829787234,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.023636975996101803,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.023636975996101803
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795133,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795133
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5612903225806452,
"acc_stderr": 0.02822949732031722,
"acc_norm": 0.5612903225806452,
"acc_norm_stderr": 0.02822949732031722
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4039408866995074,
"acc_stderr": 0.034524539038220406,
"acc_norm": 0.4039408866995074,
"acc_norm_stderr": 0.034524539038220406
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6303030303030303,
"acc_stderr": 0.03769430314512568,
"acc_norm": 0.6303030303030303,
"acc_norm_stderr": 0.03769430314512568
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5858585858585859,
"acc_stderr": 0.03509438348879629,
"acc_norm": 0.5858585858585859,
"acc_norm_stderr": 0.03509438348879629
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7564766839378239,
"acc_stderr": 0.030975436386845436,
"acc_norm": 0.7564766839378239,
"acc_norm_stderr": 0.030975436386845436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5256410256410257,
"acc_stderr": 0.025317649726448663,
"acc_norm": 0.5256410256410257,
"acc_norm_stderr": 0.025317649726448663
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085626,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.47478991596638653,
"acc_stderr": 0.0324371805513741,
"acc_norm": 0.47478991596638653,
"acc_norm_stderr": 0.0324371805513741
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7027522935779816,
"acc_stderr": 0.019595707224643523,
"acc_norm": 0.7027522935779816,
"acc_norm_stderr": 0.019595707224643523
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4027777777777778,
"acc_stderr": 0.03344887382997866,
"acc_norm": 0.4027777777777778,
"acc_norm_stderr": 0.03344887382997866
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7107843137254902,
"acc_stderr": 0.031822318676475544,
"acc_norm": 0.7107843137254902,
"acc_norm_stderr": 0.031822318676475544
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7172995780590717,
"acc_stderr": 0.029312814153955934,
"acc_norm": 0.7172995780590717,
"acc_norm_stderr": 0.029312814153955934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6143497757847534,
"acc_stderr": 0.03266842214289202,
"acc_norm": 0.6143497757847534,
"acc_norm_stderr": 0.03266842214289202
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5867768595041323,
"acc_stderr": 0.04495087843548408,
"acc_norm": 0.5867768595041323,
"acc_norm_stderr": 0.04495087843548408
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.048262172941398944,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.048262172941398944
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5337423312883436,
"acc_stderr": 0.039194155450484096,
"acc_norm": 0.5337423312883436,
"acc_norm_stderr": 0.039194155450484096
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.04498676320572924,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.04498676320572924
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7735042735042735,
"acc_stderr": 0.02742100729539292,
"acc_norm": 0.7735042735042735,
"acc_norm_stderr": 0.02742100729539292
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6896551724137931,
"acc_stderr": 0.016543785026048315,
"acc_norm": 0.6896551724137931,
"acc_norm_stderr": 0.016543785026048315
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5606936416184971,
"acc_stderr": 0.026720034380514995,
"acc_norm": 0.5606936416184971,
"acc_norm_stderr": 0.026720034380514995
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3016759776536313,
"acc_stderr": 0.015350767572220286,
"acc_norm": 0.3016759776536313,
"acc_norm_stderr": 0.015350767572220286
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.02818059632825929,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.02818059632825929
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6109324758842444,
"acc_stderr": 0.027690337536485372,
"acc_norm": 0.6109324758842444,
"acc_norm_stderr": 0.027690337536485372
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.027513747284379424,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.027513747284379424
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36524822695035464,
"acc_stderr": 0.028723863853281285,
"acc_norm": 0.36524822695035464,
"acc_norm_stderr": 0.028723863853281285
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.37353324641460234,
"acc_stderr": 0.012354994823515266,
"acc_norm": 0.37353324641460234,
"acc_norm_stderr": 0.012354994823515266
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5183823529411765,
"acc_stderr": 0.030352303395351964,
"acc_norm": 0.5183823529411765,
"acc_norm_stderr": 0.030352303395351964
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.49836601307189543,
"acc_stderr": 0.020227726838150117,
"acc_norm": 0.49836601307189543,
"acc_norm_stderr": 0.020227726838150117
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.636734693877551,
"acc_stderr": 0.03078905113903081,
"acc_norm": 0.636734693877551,
"acc_norm_stderr": 0.03078905113903081
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.681592039800995,
"acc_stderr": 0.03294118479054095,
"acc_norm": 0.681592039800995,
"acc_norm_stderr": 0.03294118479054095
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4036144578313253,
"acc_stderr": 0.038194861407583984,
"acc_norm": 0.4036144578313253,
"acc_norm_stderr": 0.038194861407583984
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7309941520467836,
"acc_stderr": 0.03401052620104089,
"acc_norm": 0.7309941520467836,
"acc_norm_stderr": 0.03401052620104089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2962056303549572,
"mc1_stderr": 0.015983595101811392,
"mc2": 0.4483407078884665,
"mc2_stderr": 0.015114510843263715
},
"harness|winogrande|5": {
"acc": 0.7103393843725335,
"acc_stderr": 0.012748550807638263
},
"harness|gsm8k|5": {
"acc": 0.024260803639120546,
"acc_stderr": 0.004238007900001396
}
}
```
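The `all` block at the top of these results summarizes the per-task scores; as an illustrative sketch (not the leaderboard's exact aggregation), an unweighted mean over a few per-task `acc` values copied from the JSON above can be computed like this:

```python
# Illustrative sketch: unweighted mean over a few per-task accuracies
# copied from the results JSON above (not the leaderboard's exact aggregation).
per_task_acc = {
    "harness|arc:challenge|25": 0.42150170648464164,
    "harness|hellaswag|10": 0.5836486755626369,
    "harness|winogrande|5": 0.7103393843725335,
}
mean_acc = sum(per_task_acc.values()) / len(per_task_acc)
print(f"{mean_acc:.4f}")  # 0.5718
```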
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_nathan0__mpt_delta_tuned_model_v3 | ---
pretty_name: Evaluation run of nathan0/mpt_delta_tuned_model_v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nathan0/mpt_delta_tuned_model_v3](https://huggingface.co/nathan0/mpt_delta_tuned_model_v3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nathan0__mpt_delta_tuned_model_v3\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T18:53:57.396321](https://huggingface.co/datasets/open-llm-leaderboard/details_nathan0__mpt_delta_tuned_model_v3/blob/main/results_2023-08-29T18%3A53%3A57.396321.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.28112521141201186,\n\
\ \"acc_stderr\": 0.032405505734312466,\n \"acc_norm\": 0.2851491508040904,\n\
\ \"acc_norm_stderr\": 0.03239478354615427,\n \"mc1\": 0.23990208078335373,\n\
\ \"mc1_stderr\": 0.014948812679062133,\n \"mc2\": 0.35460998683456907,\n\
\ \"mc2_stderr\": 0.013780749850644137\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.454778156996587,\n \"acc_stderr\": 0.014551507060836353,\n\
\ \"acc_norm\": 0.5059726962457338,\n \"acc_norm_stderr\": 0.014610348300255795\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5777733519219279,\n\
\ \"acc_stderr\": 0.004929048482760455,\n \"acc_norm\": 0.7639912368054173,\n\
\ \"acc_norm_stderr\": 0.004237598142007246\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2074074074074074,\n\
\ \"acc_stderr\": 0.03502553170678318,\n \"acc_norm\": 0.2074074074074074,\n\
\ \"acc_norm_stderr\": 0.03502553170678318\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2565789473684211,\n \"acc_stderr\": 0.0355418036802569,\n\
\ \"acc_norm\": 0.2565789473684211,\n \"acc_norm_stderr\": 0.0355418036802569\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.04408440022768076,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.04408440022768076\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.27169811320754716,\n \"acc_stderr\": 0.027377706624670713,\n\
\ \"acc_norm\": 0.27169811320754716,\n \"acc_norm_stderr\": 0.027377706624670713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n\
\ \"acc_stderr\": 0.031265112061730424,\n \"acc_norm\": 0.2138728323699422,\n\
\ \"acc_norm_stderr\": 0.031265112061730424\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.28085106382978725,\n \"acc_stderr\": 0.029379170464124825,\n\
\ \"acc_norm\": 0.28085106382978725,\n \"acc_norm_stderr\": 0.029379170464124825\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.03999423879281334,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.03999423879281334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.03695183311650232,\n\
\ \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.03695183311650232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2724867724867725,\n \"acc_stderr\": 0.022930973071633345,\n \"\
acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.022930973071633345\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n\
\ \"acc_stderr\": 0.035122074123020514,\n \"acc_norm\": 0.19047619047619047,\n\
\ \"acc_norm_stderr\": 0.035122074123020514\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.3161290322580645,\n \"acc_stderr\": 0.02645087448904276,\n \"\
acc_norm\": 0.3161290322580645,\n \"acc_norm_stderr\": 0.02645087448904276\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694433,\n \"\
acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694433\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.22727272727272727,\n \"acc_stderr\": 0.029857515673386407,\n \"\
acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.029857515673386407\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.30569948186528495,\n \"acc_stderr\": 0.033248379397581594,\n\
\ \"acc_norm\": 0.30569948186528495,\n \"acc_norm_stderr\": 0.033248379397581594\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2948717948717949,\n \"acc_stderr\": 0.023119362758232287,\n\
\ \"acc_norm\": 0.2948717948717949,\n \"acc_norm_stderr\": 0.023119362758232287\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712177,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712177\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.25210084033613445,\n \"acc_stderr\": 0.028205545033277733,\n\
\ \"acc_norm\": 0.25210084033613445,\n \"acc_norm_stderr\": 0.028205545033277733\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2119205298013245,\n \"acc_stderr\": 0.03336767086567978,\n \"\
acc_norm\": 0.2119205298013245,\n \"acc_norm_stderr\": 0.03336767086567978\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.27889908256880735,\n \"acc_stderr\": 0.019227468876463514,\n \"\
acc_norm\": 0.27889908256880735,\n \"acc_norm_stderr\": 0.019227468876463514\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.18518518518518517,\n \"acc_stderr\": 0.026491914727355147,\n \"\
acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.026491914727355147\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2647058823529412,\n \"acc_stderr\": 0.0309645179269234,\n \"acc_norm\"\
: 0.2647058823529412,\n \"acc_norm_stderr\": 0.0309645179269234\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.2742616033755274,\n \"acc_stderr\": 0.029041333510598035,\n \"\
acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.029041333510598035\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3721973094170404,\n\
\ \"acc_stderr\": 0.03244305283008731,\n \"acc_norm\": 0.3721973094170404,\n\
\ \"acc_norm_stderr\": 0.03244305283008731\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2900763358778626,\n \"acc_stderr\": 0.03980066246467765,\n\
\ \"acc_norm\": 0.2900763358778626,\n \"acc_norm_stderr\": 0.03980066246467765\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2809917355371901,\n \"acc_stderr\": 0.04103203830514512,\n \"\
acc_norm\": 0.2809917355371901,\n \"acc_norm_stderr\": 0.04103203830514512\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.32407407407407407,\n\
\ \"acc_stderr\": 0.04524596007030049,\n \"acc_norm\": 0.32407407407407407,\n\
\ \"acc_norm_stderr\": 0.04524596007030049\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.03351953879521269,\n\
\ \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.03351953879521269\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.046355501356099754,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.046355501356099754\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.029745048572674033,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.029745048572674033\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.29757343550446996,\n\
\ \"acc_stderr\": 0.01634911191290943,\n \"acc_norm\": 0.29757343550446996,\n\
\ \"acc_norm_stderr\": 0.01634911191290943\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2861271676300578,\n \"acc_stderr\": 0.02433214677913413,\n\
\ \"acc_norm\": 0.2861271676300578,\n \"acc_norm_stderr\": 0.02433214677913413\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n\
\ \"acc_stderr\": 0.014355911964767864,\n \"acc_norm\": 0.2435754189944134,\n\
\ \"acc_norm_stderr\": 0.014355911964767864\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.28104575163398693,\n \"acc_stderr\": 0.025738854797818702,\n\
\ \"acc_norm\": 0.28104575163398693,\n \"acc_norm_stderr\": 0.025738854797818702\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2861736334405145,\n\
\ \"acc_stderr\": 0.02567025924218894,\n \"acc_norm\": 0.2861736334405145,\n\
\ \"acc_norm_stderr\": 0.02567025924218894\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2932098765432099,\n \"acc_stderr\": 0.02532988817190092,\n\
\ \"acc_norm\": 0.2932098765432099,\n \"acc_norm_stderr\": 0.02532988817190092\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3120567375886525,\n \"acc_stderr\": 0.027640120545169927,\n \
\ \"acc_norm\": 0.3120567375886525,\n \"acc_norm_stderr\": 0.027640120545169927\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26597131681877445,\n\
\ \"acc_stderr\": 0.01128503316555127,\n \"acc_norm\": 0.26597131681877445,\n\
\ \"acc_norm_stderr\": 0.01128503316555127\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.02439819298665492,\n\
\ \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.02439819298665492\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2696078431372549,\n \"acc_stderr\": 0.017952449196987862,\n \
\ \"acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.017952449196987862\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.35454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505416,\n \"acc_norm\": 0.35454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505416\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2897959183673469,\n \"acc_stderr\": 0.02904308868330433,\n\
\ \"acc_norm\": 0.2897959183673469,\n \"acc_norm_stderr\": 0.02904308868330433\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n\
\ \"acc_stderr\": 0.02992941540834839,\n \"acc_norm\": 0.23383084577114427,\n\
\ \"acc_norm_stderr\": 0.02992941540834839\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3132530120481928,\n\
\ \"acc_stderr\": 0.036108050180310235,\n \"acc_norm\": 0.3132530120481928,\n\
\ \"acc_norm_stderr\": 0.036108050180310235\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.30409356725146197,\n \"acc_stderr\": 0.03528211258245232,\n\
\ \"acc_norm\": 0.30409356725146197,\n \"acc_norm_stderr\": 0.03528211258245232\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23990208078335373,\n\
\ \"mc1_stderr\": 0.014948812679062133,\n \"mc2\": 0.35460998683456907,\n\
\ \"mc2_stderr\": 0.013780749850644137\n }\n}\n```"
repo_url: https://huggingface.co/nathan0/mpt_delta_tuned_model_v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|arc:challenge|25_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|arc:challenge|25_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hellaswag|10_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hellaswag|10_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T18:53:57.396321.parquet'
- config_name: results
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- results_2023-08-29T10:14:18.725363.parquet
- split: 2023_08_29T18_53_57.396321
path:
- results_2023-08-29T18:53:57.396321.parquet
- split: latest
path:
- results_2023-08-29T18:53:57.396321.parquet
---
# Dataset Card for Evaluation run of nathan0/mpt_delta_tuned_model_v3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/nathan0/mpt_delta_tuned_model_v3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [nathan0/mpt_delta_tuned_model_v3](https://huggingface.co/nathan0/mpt_delta_tuned_model_v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nathan0__mpt_delta_tuned_model_v3",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-08-29T18:53:57.396321](https://huggingface.co/datasets/open-llm-leaderboard/details_nathan0__mpt_delta_tuned_model_v3/blob/main/results_2023-08-29T18%3A53%3A57.396321.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.28112521141201186,
"acc_stderr": 0.032405505734312466,
"acc_norm": 0.2851491508040904,
"acc_norm_stderr": 0.03239478354615427,
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062133,
"mc2": 0.35460998683456907,
"mc2_stderr": 0.013780749850644137
},
"harness|arc:challenge|25": {
"acc": 0.454778156996587,
"acc_stderr": 0.014551507060836353,
"acc_norm": 0.5059726962457338,
"acc_norm_stderr": 0.014610348300255795
},
"harness|hellaswag|10": {
"acc": 0.5777733519219279,
"acc_stderr": 0.004929048482760455,
"acc_norm": 0.7639912368054173,
"acc_norm_stderr": 0.004237598142007246
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2074074074074074,
"acc_stderr": 0.03502553170678318,
"acc_norm": 0.2074074074074074,
"acc_norm_stderr": 0.03502553170678318
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2565789473684211,
"acc_stderr": 0.0355418036802569,
"acc_norm": 0.2565789473684211,
"acc_norm_stderr": 0.0355418036802569
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768076,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768076
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27169811320754716,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.27169811320754716,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.031265112061730424,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.031265112061730424
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.28085106382978725,
"acc_stderr": 0.029379170464124825,
"acc_norm": 0.28085106382978725,
"acc_norm_stderr": 0.029379170464124825
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281334,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.03695183311650232,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.03695183311650232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2724867724867725,
"acc_stderr": 0.022930973071633345,
"acc_norm": 0.2724867724867725,
"acc_norm_stderr": 0.022930973071633345
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.035122074123020514,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.035122074123020514
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904276,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904276
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.030516530732694433,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.030516530732694433
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.029857515673386407,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.029857515673386407
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.30569948186528495,
"acc_stderr": 0.033248379397581594,
"acc_norm": 0.30569948186528495,
"acc_norm_stderr": 0.033248379397581594
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2948717948717949,
"acc_stderr": 0.023119362758232287,
"acc_norm": 0.2948717948717949,
"acc_norm_stderr": 0.023119362758232287
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712177,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712177
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.25210084033613445,
"acc_stderr": 0.028205545033277733,
"acc_norm": 0.25210084033613445,
"acc_norm_stderr": 0.028205545033277733
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2119205298013245,
"acc_stderr": 0.03336767086567978,
"acc_norm": 0.2119205298013245,
"acc_norm_stderr": 0.03336767086567978
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.27889908256880735,
"acc_stderr": 0.019227468876463514,
"acc_norm": 0.27889908256880735,
"acc_norm_stderr": 0.019227468876463514
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.026491914727355147,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.026491914727355147
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.0309645179269234,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.0309645179269234
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.029041333510598035,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.029041333510598035
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3721973094170404,
"acc_stderr": 0.03244305283008731,
"acc_norm": 0.3721973094170404,
"acc_norm_stderr": 0.03244305283008731
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2900763358778626,
"acc_stderr": 0.03980066246467765,
"acc_norm": 0.2900763358778626,
"acc_norm_stderr": 0.03980066246467765
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2809917355371901,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.2809917355371901,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.04524596007030049,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.04524596007030049
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.03351953879521269,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.03351953879521269
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.046355501356099754,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.046355501356099754
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.029745048572674033,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.029745048572674033
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.29757343550446996,
"acc_stderr": 0.01634911191290943,
"acc_norm": 0.29757343550446996,
"acc_norm_stderr": 0.01634911191290943
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2861271676300578,
"acc_stderr": 0.02433214677913413,
"acc_norm": 0.2861271676300578,
"acc_norm_stderr": 0.02433214677913413
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767864,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767864
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.28104575163398693,
"acc_stderr": 0.025738854797818702,
"acc_norm": 0.28104575163398693,
"acc_norm_stderr": 0.025738854797818702
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2861736334405145,
"acc_stderr": 0.02567025924218894,
"acc_norm": 0.2861736334405145,
"acc_norm_stderr": 0.02567025924218894
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2932098765432099,
"acc_stderr": 0.02532988817190092,
"acc_norm": 0.2932098765432099,
"acc_norm_stderr": 0.02532988817190092
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3120567375886525,
"acc_stderr": 0.027640120545169927,
"acc_norm": 0.3120567375886525,
"acc_norm_stderr": 0.027640120545169927
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26597131681877445,
"acc_stderr": 0.01128503316555127,
"acc_norm": 0.26597131681877445,
"acc_norm_stderr": 0.01128503316555127
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20220588235294118,
"acc_stderr": 0.02439819298665492,
"acc_norm": 0.20220588235294118,
"acc_norm_stderr": 0.02439819298665492
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.017952449196987862,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.017952449196987862
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.35454545454545455,
"acc_stderr": 0.04582004841505416,
"acc_norm": 0.35454545454545455,
"acc_norm_stderr": 0.04582004841505416
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2897959183673469,
"acc_stderr": 0.02904308868330433,
"acc_norm": 0.2897959183673469,
"acc_norm_stderr": 0.02904308868330433
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.02992941540834839,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.02992941540834839
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3132530120481928,
"acc_stderr": 0.036108050180310235,
"acc_norm": 0.3132530120481928,
"acc_norm_stderr": 0.036108050180310235
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.30409356725146197,
"acc_stderr": 0.03528211258245232,
"acc_norm": 0.30409356725146197,
"acc_norm_stderr": 0.03528211258245232
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062133,
"mc2": 0.35460998683456907,
"mc2_stderr": 0.013780749850644137
}
}
```
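The `"all"` block at the top of the results aggregates the per-task scores. As an illustration, an aggregate accuracy can be recomputed as the mean of the per-task `acc` values; the sketch below uses a small hypothetical subset of the tasks above (the real `"all"` value averages every task in the run):

```python
# Sketch: recompute an aggregate accuracy as the mean of per-task "acc"
# values, mirroring the "all" block in the results JSON above. The tasks
# dict is a small illustrative subset, not the full result set.
tasks = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.23},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.2074074074074074},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.2565789473684211},
}

# Unweighted mean over the selected tasks.
mean_acc = sum(t["acc"] for t in tasks.values()) / len(tasks)
print(round(mean_acc, 4))
```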
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
marcel-gohsen/dstc3 | ---
dataset_info:
features:
- name: session
dtype: string
- name: caller
dtype: string
- name: turn
dtype: int64
- name: transcript
dtype: string
- name: audio
dtype: audio
- name: intent
sequence: string
- name: slots
sequence: string
- name: cam
dtype: string
splits:
- name: test
num_bytes: 1665262025.24
num_examples: 18715
- name: seed
num_bytes: 9443304.0
num_examples: 109
download_size: 1097235525
dataset_size: 1674705329.24
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: seed
path: data/seed-*
---
|
mallam-ai/marx-engels | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: content
dtype: string
- name: title
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 20866538
num_examples: 1297
download_size: 11056454
dataset_size: 20866538
license: pddl
task_categories:
- text-generation
language:
- en
pretty_name: Marx and Engels Internet Archive
size_categories:
- 1K<n<10K
---
# Dataset Card for "marx-engels"
This dataset was generated by scraping https://www.marxists.org/archive/marx/index.htm
## Licensing Information
According to **marxists.org**, unless otherwise noted, texts in the archive are in the public domain.
See https://www.marxists.org/admin/janitor/faq.htm for further information. |
YoonSeul/legal_train_v1 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 33357358
num_examples: 14716
download_size: 15578888
dataset_size: 33357358
---
# Dataset Card for "legal_train_v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Vezora/testtoxic | ---
license: apache-2.0
---
100% of the credit for this dataset goes to Jon Durbin. This is the same dataset, converted to the UltraFeedback binarized format so that it works with the Hugging Face alignment notebook's DPO script.
original dataset: https://huggingface.co/datasets/unalignment/toxic-dpo-v0.1
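The conversion is roughly this shape (a sketch, not the repo's actual script; the input field names `prompt`/`chosen`/`rejected` and the chat-message layout of `ultrafeedback_binarized` are assumptions):

```python
import hashlib

def to_ultrafeedback_binarized(example: dict) -> dict:
    """Turn one {prompt, chosen, rejected} DPO record (toxic-dpo-v0.1 style)
    into the chat-message layout used by ultrafeedback_binarized."""
    prompt = example["prompt"]
    return {
        "prompt": prompt,
        # ultrafeedback_binarized carries a stable per-prompt id (sha256 assumed here)
        "prompt_id": hashlib.sha256(prompt.encode("utf-8")).hexdigest(),
        "chosen": [
            {"role": "user", "content": prompt},
            {"role": "assistant", "content": example["chosen"]},
        ],
        "rejected": [
            {"role": "user", "content": prompt},
            {"role": "assistant", "content": example["rejected"]},
        ],
    }
```

With 🤗 Datasets, a mapping like this can be applied via `dataset.map(to_ultrafeedback_binarized)`.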
This repo contains the script used to convert to the UltraFeedback format, along with the dataset. |
marvmk/scalableMLDL1 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 5726523552
num_examples: 5962
- name: test
num_bytes: 2546311152
num_examples: 2651
download_size: 1397383253
dataset_size: 8272834704
---
# Dataset Card for "scalableMLDL1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JavaChu/eagle-ner-json | ---
task_categories:
- text-classification
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
liuyanchen1015/MULTI_VALUE_stsb_more_much | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 7384
num_examples: 34
- name: test
num_bytes: 4692
num_examples: 21
- name: train
num_bytes: 22591
num_examples: 110
download_size: 33844
dataset_size: 34667
---
# Dataset Card for "MULTI_VALUE_stsb_more_much"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
manaschauhan/Sales_data | ---
license: other
---
|
open-llm-leaderboard/details_aari1995__germeo-7b-laser | ---
pretty_name: Evaluation run of aari1995/germeo-7b-laser
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [aari1995/germeo-7b-laser](https://huggingface.co/aari1995/germeo-7b-laser) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aari1995__germeo-7b-laser\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-13T18:27:49.824954](https://huggingface.co/datasets/open-llm-leaderboard/details_aari1995__germeo-7b-laser/blob/main/results_2024-01-13T18-27-49.824954.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6055285169834799,\n\
\ \"acc_stderr\": 0.033079665720799664,\n \"acc_norm\": 0.6095438527185658,\n\
\ \"acc_norm_stderr\": 0.03374506182230424,\n \"mc1\": 0.3733170134638923,\n\
\ \"mc1_stderr\": 0.016932370557570634,\n \"mc2\": 0.5382753959859625,\n\
\ \"mc2_stderr\": 0.01572725969894502\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5784982935153583,\n \"acc_stderr\": 0.014430197069326023,\n\
\ \"acc_norm\": 0.6075085324232082,\n \"acc_norm_stderr\": 0.014269634635670728\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6415056761601274,\n\
\ \"acc_stderr\": 0.004785781979354868,\n \"acc_norm\": 0.8281218880701056,\n\
\ \"acc_norm_stderr\": 0.003765034286153438\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849724,\n\
\ \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849724\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.6641509433962264,\n \"acc_stderr\": 0.02906722014664483,\n \
\ \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.02906722014664483\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n\
\ \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n\
\ \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7387096774193549,\n\
\ \"acc_stderr\": 0.02499305339776482,\n \"acc_norm\": 0.7387096774193549,\n\
\ \"acc_norm_stderr\": 0.02499305339776482\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.035107665979592154,\n\
\ \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"\
acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.02503387058301518,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.02503387058301518\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6205128205128205,\n \"acc_stderr\": 0.024603626924097424,\n\
\ \"acc_norm\": 0.6205128205128205,\n \"acc_norm_stderr\": 0.024603626924097424\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606648,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606648\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8036697247706422,\n \"acc_stderr\": 0.017030719339154333,\n \"\
acc_norm\": 0.8036697247706422,\n \"acc_norm_stderr\": 0.017030719339154333\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7549019607843137,\n \"acc_stderr\": 0.03019028245350195,\n \"\
acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.03019028245350195\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808517,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808517\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591205,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591205\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.043012503996908764,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.043012503996908764\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7982120051085568,\n\
\ \"acc_stderr\": 0.014351702181636863,\n \"acc_norm\": 0.7982120051085568,\n\
\ \"acc_norm_stderr\": 0.014351702181636863\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6791907514450867,\n \"acc_stderr\": 0.0251310002336479,\n\
\ \"acc_norm\": 0.6791907514450867,\n \"acc_norm_stderr\": 0.0251310002336479\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31731843575418994,\n\
\ \"acc_stderr\": 0.015566392630057031,\n \"acc_norm\": 0.31731843575418994,\n\
\ \"acc_norm_stderr\": 0.015566392630057031\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.027184498909941613,\n\
\ \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.027184498909941613\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.025839898334877983,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.025839898334877983\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6975308641975309,\n \"acc_stderr\": 0.02555765398186807,\n\
\ \"acc_norm\": 0.6975308641975309,\n \"acc_norm_stderr\": 0.02555765398186807\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46153846153846156,\n\
\ \"acc_stderr\": 0.012732398286190444,\n \"acc_norm\": 0.46153846153846156,\n\
\ \"acc_norm_stderr\": 0.012732398286190444\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5845588235294118,\n \"acc_stderr\": 0.029935342707877746,\n\
\ \"acc_norm\": 0.5845588235294118,\n \"acc_norm_stderr\": 0.029935342707877746\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6127450980392157,\n \"acc_stderr\": 0.019706875804085634,\n \
\ \"acc_norm\": 0.6127450980392157,\n \"acc_norm_stderr\": 0.019706875804085634\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6816326530612244,\n \"acc_stderr\": 0.029822533793982066,\n\
\ \"acc_norm\": 0.6816326530612244,\n \"acc_norm_stderr\": 0.029822533793982066\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n\
\ \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n\
\ \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072766,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072766\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3733170134638923,\n\
\ \"mc1_stderr\": 0.016932370557570634,\n \"mc2\": 0.5382753959859625,\n\
\ \"mc2_stderr\": 0.01572725969894502\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7561168113654302,\n \"acc_stderr\": 0.01206892327890819\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4336618650492798,\n \
\ \"acc_stderr\": 0.013650728047064685\n }\n}\n```"
repo_url: https://huggingface.co/aari1995/germeo-7b-laser
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|arc:challenge|25_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|gsm8k|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hellaswag|10_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T18-27-49.824954.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T18-27-49.824954.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- '**/details_harness|winogrande|5_2024-01-13T18-27-49.824954.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-13T18-27-49.824954.parquet'
- config_name: results
data_files:
- split: 2024_01_13T18_27_49.824954
path:
- results_2024-01-13T18-27-49.824954.parquet
- split: latest
path:
- results_2024-01-13T18-27-49.824954.parquet
---
# Dataset Card for Evaluation run of aari1995/germeo-7b-laser
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [aari1995/germeo-7b-laser](https://huggingface.co/aari1995/germeo-7b-laser) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named after the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_aari1995__germeo-7b-laser",
"harness_winogrande_5",
	split="latest")
```
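Each configuration name is derived from the harness task name shown in the results JSON: pipes, hyphens, and colons are replaced with underscores (e.g. `harness|hendrycksTest-abstract_algebra|5` becomes `harness_hendrycksTest_abstract_algebra_5`). A small helper sketching this mapping, inferred from the config list above:

```python
def harness_to_config_name(task: str) -> str:
    """Map a harness task id (as it appears in the results JSON)
    to the corresponding dataset configuration name.

    e.g. "harness|truthfulqa:mc|0" -> "harness_truthfulqa_mc_0"
    """
    return task.replace("|", "_").replace("-", "_").replace(":", "_")
```

This is a convenience inferred from the naming pattern in this card, not an official API; the authoritative list of configuration names is the YAML header above.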
## Latest results
These are the [latest results from run 2024-01-13T18:27:49.824954](https://huggingface.co/datasets/open-llm-leaderboard/details_aari1995__germeo-7b-laser/blob/main/results_2024-01-13T18-27-49.824954.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in its timestamped split and in the "latest" split of its configuration):
```python
{
"all": {
"acc": 0.6055285169834799,
"acc_stderr": 0.033079665720799664,
"acc_norm": 0.6095438527185658,
"acc_norm_stderr": 0.03374506182230424,
"mc1": 0.3733170134638923,
"mc1_stderr": 0.016932370557570634,
"mc2": 0.5382753959859625,
"mc2_stderr": 0.01572725969894502
},
"harness|arc:challenge|25": {
"acc": 0.5784982935153583,
"acc_stderr": 0.014430197069326023,
"acc_norm": 0.6075085324232082,
"acc_norm_stderr": 0.014269634635670728
},
"harness|hellaswag|10": {
"acc": 0.6415056761601274,
"acc_stderr": 0.004785781979354868,
"acc_norm": 0.8281218880701056,
"acc_norm_stderr": 0.003765034286153438
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.03910525752849724,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.03910525752849724
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6641509433962264,
"acc_stderr": 0.02906722014664483,
"acc_norm": 0.6641509433962264,
"acc_norm_stderr": 0.02906722014664483
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7387096774193549,
"acc_stderr": 0.02499305339776482,
"acc_norm": 0.7387096774193549,
"acc_norm_stderr": 0.02499305339776482
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.02503387058301518,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.02503387058301518
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6205128205128205,
"acc_stderr": 0.024603626924097424,
"acc_norm": 0.6205128205128205,
"acc_norm_stderr": 0.024603626924097424
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606648,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606648
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8036697247706422,
"acc_stderr": 0.017030719339154333,
"acc_norm": 0.8036697247706422,
"acc_norm_stderr": 0.017030719339154333
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808517,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808517
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591205,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591205
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.043012503996908764,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.043012503996908764
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7982120051085568,
"acc_stderr": 0.014351702181636863,
"acc_norm": 0.7982120051085568,
"acc_norm_stderr": 0.014351702181636863
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6791907514450867,
"acc_stderr": 0.0251310002336479,
"acc_norm": 0.6791907514450867,
"acc_norm_stderr": 0.0251310002336479
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31731843575418994,
"acc_stderr": 0.015566392630057031,
"acc_norm": 0.31731843575418994,
"acc_norm_stderr": 0.015566392630057031
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.027184498909941613,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.027184498909941613
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.025839898334877983,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.025839898334877983
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6975308641975309,
"acc_stderr": 0.02555765398186807,
"acc_norm": 0.6975308641975309,
"acc_norm_stderr": 0.02555765398186807
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46153846153846156,
"acc_stderr": 0.012732398286190444,
"acc_norm": 0.46153846153846156,
"acc_norm_stderr": 0.012732398286190444
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5845588235294118,
"acc_stderr": 0.029935342707877746,
"acc_norm": 0.5845588235294118,
"acc_norm_stderr": 0.029935342707877746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6127450980392157,
"acc_stderr": 0.019706875804085634,
"acc_norm": 0.6127450980392157,
"acc_norm_stderr": 0.019706875804085634
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6816326530612244,
"acc_stderr": 0.029822533793982066,
"acc_norm": 0.6816326530612244,
"acc_norm_stderr": 0.029822533793982066
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072766,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072766
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3733170134638923,
"mc1_stderr": 0.016932370557570634,
"mc2": 0.5382753959859625,
"mc2_stderr": 0.01572725969894502
},
"harness|winogrande|5": {
"acc": 0.7561168113654302,
"acc_stderr": 0.01206892327890819
},
"harness|gsm8k|5": {
"acc": 0.4336618650492798,
"acc_stderr": 0.013650728047064685
}
}
```
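The per-subtask `hendrycksTest` (MMLU) entries above can be aggregated into a single average accuracy by filtering the results dict on the task-name prefix. A minimal sketch, using a two-entry subset of the results shown above:

```python
# Subset of the results JSON above, keyed by harness task id.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.32},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5703703703703704},
    "harness|winogrande|5": {"acc": 0.7561168113654302},  # not an MMLU subtask
}

# Collect only the MMLU subtasks and average their accuracies.
mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
```

This is an illustrative unweighted mean over subtasks; the leaderboard's own aggregation (stored in the "results" configuration) is the authoritative number.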
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
cessapellido/sample | ---
license: unknown
---
|
Guizmus/AnimeChanStyle | ---
license: creativeml-openrail-m
thumbnail: "https://huggingface.co/datasets/Guizmus/AnimeChanStyle/resolve/main/showcase_dataset.jpg"
---

This is the dataset used for making the model : https://huggingface.co/Guizmus/AnimeChanStyle
The images were made by users of the Stable Diffusion Discord using CreativeML-OpenRAIL-M licensed models, with the intent of building this dataset.
It contains 90 pictures, each captioned by hand with a description of its content plus the suffix ",AnimeChan Style".
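As an illustration of the captioning convention described above, a minimal sketch (the file names and caption texts below are hypothetical, not taken from the dataset; only the ",AnimeChan Style" suffix comes from this card):

```python
# Hypothetical examples of hand-written captions; the actual dataset's
# captions differ. Only the ",AnimeChan Style" suffix is from the card.
captions = {
    "image_001.png": "a girl with blue hair waving at the viewer",
    "image_002.png": "a chibi mascot holding a paintbrush",
}

# Append the style token used during Dreambooth-style training.
suffixed = {name: text + ",AnimeChan Style" for name, text in captions.items()}
print(suffixed["image_001.png"])
```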
The collection process was open to the public for less than a day, until enough variety had been gathered to train, through a Dreambooth method, a style corresponding to the different members of this community.
The captioned pictures are available in [this zip file](https://huggingface.co/datasets/Guizmus/AnimeChanStyle/resolve/main/AnimeChanStyle%20v2.3.zip) |
huggingartists/lil-nas-x | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/lil-nas-x"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.182872 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:block; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/f50e1ac333da1f744f98eec38e44dd29.640x640x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/lil-nas-x">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Lil Nas X</div>
<a href="https://genius.com/artists/lil-nas-x">
<div style="text-align: center; font-size: 14px;">@lil-nas-x</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/lil-nas-x).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/lil-nas-x")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
| 111 | - | - |
The 'train' split can easily be divided into 'train', 'validation', and 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np

datasets = load_dataset("huggingartists/lil-nas-x")

# Split proportions: 90% train, 7% validation, 3% test.
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03

texts = datasets["train"]["text"]
train, validation, test = np.split(
    texts,
    [
        int(len(texts) * train_percentage),
        int(len(texts) * (train_percentage + validation_percentage)),
    ],
)

datasets = DatasetDict(
    {
        "train": Dataset.from_dict({"text": list(train)}),
        "validation": Dataset.from_dict({"text": list(validation)}),
        "test": Dataset.from_dict({"text": list(test)}),
    }
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
  author = {Aleksey Korshuk},
  year = {2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
TeeA/qa-plot-chart-generation | ---
dataset_info:
- config_name: chart-template
features:
- name: kind
dtype: string
- name: template
dtype: string
- name: parameters
sequence: string
splits:
- name: train
num_bytes: 3276
num_examples: 12
download_size: 5257
dataset_size: 3276
- config_name: default
features:
- name: db_id
dtype: string
- name: table_name
dtype: string
- name: column_names
sequence: string
- name: column_types
sequence: string
- name: gemini_response
dtype: string
splits:
- name: train
num_bytes: 1504495
num_examples: 876
download_size: 388892
dataset_size: 1504495
- config_name: official
features:
- name: db_id
dtype: string
- name: table_name
dtype: string
- name: column_names
sequence: string
- name: column_types
sequence: string
- name: questions
dtype: string
splits:
- name: train
num_bytes: 658094
num_examples: 542
download_size: 183976
dataset_size: 658094
configs:
- config_name: chart-template
data_files:
- split: train
path: chart-template/train-*
- config_name: default
data_files:
- split: train
path: data/train-*
- config_name: official
data_files:
- split: train
path: data/official/train-*
---
This dataset contains **2003 Vietnamese questions/requirements** about rendering/plotting multiple kinds of charts.
The chart kinds and their counts include:
```
{
    'bar': 522,
    'scatter': 284,
    'line': 239,
    'pie': 228,
    'barh': 120,
    'hist': 85,
    'stacked-bar': 74,
    'grouped-bar': 46,
    'box': 37,
    'heatmap': 26,
    'bubble': 10,
    'area': 6,
    'stacked-barh': 5,
    'multi-line': 3,
    ...
}
```
|
onealeph0cc/voting-agents-dataset-3 | ---
license: apache-2.0
dataset_info:
features:
- name: agent-name
dtype: string
- name: goal
dtype: string
- name: action
dtype: string
- name: vote-parsed
dtype: string
- name: vote-unparsed
dtype: string
- name: rate
dtype: float64
splits:
- name: train
num_bytes: 632111997
num_examples: 224615
download_size: 236901773
dataset_size: 632111997
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hippocrates/PubmedQA_train | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 701570152
num_examples: 211269
- name: valid
num_bytes: 159299
num_examples: 50
- name: test
num_bytes: 1622241
num_examples: 500
download_size: 359787344
dataset_size: 703351692
---
# Dataset Card for "PubmedQA_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
player1537/Bloom-560m-trained-on-Wizard-Vicuna-Uncensored | ---
dataset_info:
features:
- name: text
dtype: string
- name: tokens
sequence: int64
splits:
- name: train
num_bytes: 1115967006
num_examples: 86379
download_size: 375663823
dataset_size: 1115967006
---
# Dataset Card for "Bloom-560m-trained-on-Wizard-Vicuna-Uncensored"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/squad_train50000_eval1000_dec | ---
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: text
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: validation
num_bytes: 3184837
num_examples: 1000
- name: train
num_bytes: 169722340
num_examples: 50000
download_size: 35308668
dataset_size: 172907177
---
# Dataset Card for "squad_train50000_eval1000_dec"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
iamkaikai/PHOTO-ILLUSTRATION-ART | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 8015460.0
num_examples: 194
download_size: 7995170
dataset_size: 8015460.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "PHOTO-ILLUSTRATION-ART"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_automl_100000_jannis_sgosdt_l256_dim10_d3_sd0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: input_y_clean
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 2364400000
num_examples: 100000
- name: validation
num_bytes: 236440000
num_examples: 10000
download_size: 1611785428
dataset_size: 2600840000
---
# Dataset Card for "autotree_automl_100000_jannis_sgosdt_l256_dim10_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
parkerhorn/omgcrack | ---
license: openrail
---
|
heliosprime/twitter_dataset_1713107765 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 15631
num_examples: 39
download_size: 16304
dataset_size: 15631
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713107765"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ekhlass/flutter_constraints | ---
license: apache-2.0
---
|